00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v23.11" build number 626
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3292
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.040 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.040 The recommended git tool is: git
00:00:00.040 using credential 00000000-0000-0000-0000-000000000002
00:00:00.043 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.054 Fetching changes from the remote Git repository
00:00:00.055 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.077 Using shallow fetch with depth 1
00:00:00.077 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.077 > git --version # timeout=10
00:00:00.113 > git --version # 'git version 2.39.2'
00:00:00.113 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.163 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.163 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.305 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.315 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.326 Checking out Revision f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 (FETCH_HEAD)
00:00:03.326 > git config core.sparsecheckout # timeout=10
00:00:03.335 > git read-tree -mu HEAD # timeout=10
00:00:03.351 > git checkout -f f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=5
00:00:03.371 Commit message: "spdk-abi-per-patch: fix check-so-deps-docker-autotest parameters"
00:00:03.371 > git rev-list --no-walk f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=10
00:00:03.453 [Pipeline] Start of Pipeline
00:00:03.465 [Pipeline] library
00:00:03.466 Loading library shm_lib@master
00:00:03.467 Library shm_lib@master is cached. Copying from home.
00:00:03.483 [Pipeline] node
00:00:03.494 Running on WFP39 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.496 [Pipeline] {
00:00:03.508 [Pipeline] catchError
00:00:03.509 [Pipeline] {
00:00:03.520 [Pipeline] wrap
00:00:03.529 [Pipeline] {
00:00:03.535 [Pipeline] stage
00:00:03.536 [Pipeline] { (Prologue)
00:00:03.706 [Pipeline] sh
00:00:03.985 + logger -p user.info -t JENKINS-CI
00:00:04.001 [Pipeline] echo
00:00:04.002 Node: WFP39
00:00:04.010 [Pipeline] sh
00:00:04.310 [Pipeline] setCustomBuildProperty
00:00:04.325 [Pipeline] echo
00:00:04.327 Cleanup processes
00:00:04.331 [Pipeline] sh
00:00:04.611 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.611 3037143 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.622 [Pipeline] sh
00:00:04.902 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.902 ++ grep -v 'sudo pgrep'
00:00:04.902 ++ awk '{print $1}'
00:00:04.902 + sudo kill -9
00:00:04.902 + true
00:00:04.917 [Pipeline] cleanWs
00:00:04.925 [WS-CLEANUP] Deleting project workspace...
00:00:04.925 [WS-CLEANUP] Deferred wipeout is used...
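The cleanup trace above pipes `pgrep -af` through `grep -v` and `awk` and force-kills whatever remains; the `+ true` in the log is the fallback that keeps the stage green when the resulting PID list is empty and `kill` fails. A minimal sketch of that pattern follows — the function name and example path are hypothetical, not part of the job scripts:

```shell
#!/usr/bin/env bash
# Hypothetical helper mirroring the cleanup step in the log.
cleanup_stale() {
    local pids
    # pgrep -af prints "PID CMDLINE"; drop the pgrep pipeline itself and
    # this shell, then keep only the PID column.
    pids=$(pgrep -af "$1" | grep -vE "pgrep|^$$ " | awk '{print $1}')
    # $pids is left unquoted so each PID becomes a separate argument;
    # "|| true" keeps the step from failing when nothing matched.
    kill -9 $pids 2>/dev/null || true
}

cleanup_stale "/tmp/hypothetical-stale-workspace"
```

The `|| true` is what makes this safe to run unconditionally at the start of a build, whether or not a previous run left processes behind.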
00:00:04.931 [WS-CLEANUP] done
00:00:04.934 [Pipeline] setCustomBuildProperty
00:00:04.942 [Pipeline] sh
00:00:05.275 + sudo git config --global --replace-all safe.directory '*'
00:00:05.340 [Pipeline] httpRequest
00:00:05.356 [Pipeline] echo
00:00:05.358 Sorcerer 10.211.164.101 is alive
00:00:05.364 [Pipeline] httpRequest
00:00:05.387 HttpMethod: GET
00:00:05.387 URL: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:05.388 Sending request to url: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:05.394 Response Code: HTTP/1.1 200 OK
00:00:05.394 Success: Status code 200 is in the accepted range: 200,404
00:00:05.395 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:22.540 [Pipeline] sh
00:00:22.825 + tar --no-same-owner -xf jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:22.841 [Pipeline] httpRequest
00:00:22.868 [Pipeline] echo
00:00:22.870 Sorcerer 10.211.164.101 is alive
00:00:22.879 [Pipeline] httpRequest
00:00:22.884 HttpMethod: GET
00:00:22.884 URL: http://10.211.164.101/packages/spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:00:22.885 Sending request to url: http://10.211.164.101/packages/spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:00:22.894 Response Code: HTTP/1.1 200 OK
00:00:22.894 Success: Status code 200 is in the accepted range: 200,404
00:00:22.895 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:01:17.551 [Pipeline] sh
00:01:17.833 + tar --no-same-owner -xf spdk_dbef7efacb6f3438cd0fe1344a67946669fb1419.tar.gz
00:01:22.034 [Pipeline] sh
00:01:22.313 + git -C spdk log --oneline -n5
00:01:22.313 dbef7efac test: fix dpdk builds on ubuntu24
00:01:22.313 4b94202c6 lib/event: Bug fix for framework_set_scheduler
00:01:22.313 507e9ba07 nvme: add lock_depth for ctrlr_lock
00:01:22.313 62fda7b5f nvme: check pthread_mutex_destroy() return value
00:01:22.313 e03c164a1 nvme: add nvme_ctrlr_lock
00:01:22.329 [Pipeline] withCredentials
00:01:22.338 > git --version # timeout=10
00:01:22.349 > git --version # 'git version 2.39.2'
00:01:22.366 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:22.368 [Pipeline] {
00:01:22.377 [Pipeline] retry
00:01:22.379 [Pipeline] {
00:01:22.394 [Pipeline] sh
00:01:22.673 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11
00:01:23.253 [Pipeline] }
00:01:23.280 [Pipeline] // retry
00:01:23.285 [Pipeline] }
00:01:23.307 [Pipeline] // withCredentials
00:01:23.320 [Pipeline] httpRequest
00:01:23.350 [Pipeline] echo
00:01:23.351 Sorcerer 10.211.164.101 is alive
00:01:23.359 [Pipeline] httpRequest
00:01:23.364 HttpMethod: GET
00:01:23.365 URL: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:23.365 Sending request to url: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:23.429 Response Code: HTTP/1.1 200 OK
00:01:23.430 Success: Status code 200 is in the accepted range: 200,404
00:01:23.430 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:30.831 [Pipeline] sh
00:01:31.116 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:33.035 [Pipeline] sh
00:01:33.320 + git -C dpdk log --oneline -n5
00:01:33.320 eeb0605f11 version: 23.11.0
00:01:33.320 238778122a doc: update release notes for 23.11
00:01:33.320 46aa6b3cfc doc: fix description of RSS features
00:01:33.320 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:33.320 7e421ae345 devtools: support skipping forbid rule check
00:01:33.330 [Pipeline] }
00:01:33.348 [Pipeline] // stage
00:01:33.355 [Pipeline] stage
00:01:33.357 [Pipeline] { (Prepare)
00:01:33.377 [Pipeline] writeFile
00:01:33.396 [Pipeline] sh
00:01:33.680 + logger -p user.info -t JENKINS-CI
00:01:33.694 [Pipeline] sh
00:01:33.977 + logger -p user.info -t JENKINS-CI
00:01:33.990 [Pipeline] sh
00:01:34.272 + cat autorun-spdk.conf
00:01:34.272 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:34.272 SPDK_RUN_UBSAN=1
00:01:34.272 SPDK_TEST_FUZZER=1
00:01:34.272 SPDK_TEST_FUZZER_SHORT=1
00:01:34.272 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:34.272 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:34.279 RUN_NIGHTLY=1
00:01:34.284 [Pipeline] readFile
00:01:34.310 [Pipeline] withEnv
00:01:34.312 [Pipeline] {
00:01:34.326 [Pipeline] sh
00:01:34.645 + set -ex
00:01:34.645 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:34.645 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:34.645 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:34.645 ++ SPDK_RUN_UBSAN=1
00:01:34.645 ++ SPDK_TEST_FUZZER=1
00:01:34.645 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:34.645 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:34.645 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:34.645 ++ RUN_NIGHTLY=1
00:01:34.645 + case $SPDK_TEST_NVMF_NICS in
00:01:34.645 + DRIVERS=
00:01:34.645 + [[ -n '' ]]
00:01:34.645 + exit 0
00:01:34.652 [Pipeline] }
00:01:34.663 [Pipeline] // withEnv
00:01:34.666 [Pipeline] }
00:01:34.675 [Pipeline] // stage
00:01:34.681 [Pipeline] catchError
00:01:34.682 [Pipeline] {
00:01:34.691 [Pipeline] timeout
00:01:34.691 Timeout set to expire in 30 min
00:01:34.692 [Pipeline] {
00:01:34.702 [Pipeline] stage
00:01:34.703 [Pipeline] { (Tests)
00:01:34.713 [Pipeline] sh
00:01:34.991 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:34.991 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:34.991 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:34.991 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:34.991 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:34.991 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:34.991 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:34.991 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:34.991 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:34.991 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:34.991 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:34.991 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:34.991 + source /etc/os-release
00:01:34.991 ++ NAME='Fedora Linux'
00:01:34.991 ++ VERSION='38 (Cloud Edition)'
00:01:34.991 ++ ID=fedora
00:01:34.991 ++ VERSION_ID=38
00:01:34.991 ++ VERSION_CODENAME=
00:01:34.991 ++ PLATFORM_ID=platform:f38
00:01:34.991 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:34.991 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:34.991 ++ LOGO=fedora-logo-icon
00:01:34.991 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:34.991 ++ HOME_URL=https://fedoraproject.org/
00:01:34.991 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:34.991 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:34.991 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:34.991 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:34.991 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:34.991 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:34.991 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:34.991 ++ SUPPORT_END=2024-05-14
00:01:34.991 ++ VARIANT='Cloud Edition'
00:01:34.991 ++ VARIANT_ID=cloud
00:01:34.991 + uname -a
00:01:34.991 Linux spdk-wfp-39 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux
00:01:34.991 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:38.279 Hugepages
00:01:38.279 node hugesize free / total
00:01:38.279 node0 1048576kB 0 / 0
00:01:38.279 node0 2048kB 0 / 0
00:01:38.279 node1 1048576kB 0 / 0
00:01:38.279 node1 2048kB 0 / 0
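The `set -ex` trace above only sources the conf file after an existence check, which is why each assignment is echoed with a `++` prefix. A reduced sketch of that guard, using a temp file with two stand-in variables rather than the full Jenkins config:

```shell
#!/usr/bin/env bash
set -e
# Write a reduced stand-in for autorun-spdk.conf to a temp file.
conf=$(mktemp)
cat > "$conf" <<'EOF'
SPDK_RUN_FUNCTIONAL_TEST=1
SPDK_TEST_FUZZER_SHORT=1
EOF
# Guarded load, as in the trace: only source the file if it exists.
[[ -f $conf ]] && source "$conf"
echo "functional=$SPDK_RUN_FUNCTIONAL_TEST fuzzer_short=$SPDK_TEST_FUZZER_SHORT"
rm -f "$conf"
```

Because the file is plain `KEY=value` shell, sourcing it drops every setting straight into the runner's environment for the rest of the stage.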
00:01:38.279 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:38.279 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:38.279 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:38.538 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:01:38.538 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:38.538 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:38.538 + rm -f /tmp/spdk-ld-path
00:01:38.538 + source autorun-spdk.conf
00:01:38.538 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:38.538 ++ SPDK_RUN_UBSAN=1
00:01:38.538 ++ SPDK_TEST_FUZZER=1
00:01:38.538 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:38.538 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:38.538 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:38.538 ++ RUN_NIGHTLY=1
00:01:38.538 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:38.538 + [[ -n '' ]]
00:01:38.538 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:38.538 + for M in /var/spdk/build-*-manifest.txt
00:01:38.538 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:38.538 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:38.538 + for M in /var/spdk/build-*-manifest.txt
00:01:38.538 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:38.538 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:38.538 ++ uname
00:01:38.538 + [[ Linux == \L\i\n\u\x ]]
00:01:38.538 + sudo dmesg -T
00:01:38.538 + sudo dmesg --clear
00:01:38.798 + dmesg_pid=3038208
00:01:38.798 + [[ Fedora Linux == FreeBSD ]]
00:01:38.798 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:38.798 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:38.798 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:38.798 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2
00:01:38.798 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2
00:01:38.798 + [[ -x /usr/src/fio-static/fio ]]
00:01:38.798 + export FIO_BIN=/usr/src/fio-static/fio
00:01:38.798 + FIO_BIN=/usr/src/fio-static/fio
00:01:38.798 + sudo dmesg -Tw
00:01:38.798 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:38.798 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:38.798 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:38.798 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:38.798 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:38.798 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:38.798 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:38.798 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:38.798 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:38.798 Test configuration:
00:01:38.798 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:38.798 SPDK_RUN_UBSAN=1
00:01:38.798 SPDK_TEST_FUZZER=1
00:01:38.798 SPDK_TEST_FUZZER_SHORT=1
00:01:38.798 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:38.798 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:38.798 RUN_NIGHTLY=1 13:13:57 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:01:38.798 13:13:57 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:38.798 13:13:57 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:38.798 13:13:57 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:38.798 13:13:57 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:38.798 13:13:57 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:38.798 13:13:57 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:38.798 13:13:57 -- paths/export.sh@5 -- $ export PATH
00:01:38.798 13:13:57 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:38.798 13:13:57 -- common/autobuild_common.sh@437 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:01:38.798 13:13:57 -- common/autobuild_common.sh@438 -- $ date +%s
00:01:38.798 13:13:57 -- common/autobuild_common.sh@438 -- $ mktemp -dt spdk_1721819637.XXXXXX
00:01:38.798 13:13:57 -- common/autobuild_common.sh@438 -- $ SPDK_WORKSPACE=/tmp/spdk_1721819637.i28vGs
00:01:38.798 13:13:57 -- common/autobuild_common.sh@440 -- $ [[ -n '' ]]
00:01:38.798 13:13:57 -- common/autobuild_common.sh@444 -- $ '[' -n v23.11 ']'
00:01:38.798 13:13:57 -- common/autobuild_common.sh@445 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:38.798 13:13:57 -- common/autobuild_common.sh@445 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:01:38.798 13:13:57 -- common/autobuild_common.sh@451 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:38.798 13:13:57 -- common/autobuild_common.sh@453 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:38.798 13:13:57 -- common/autobuild_common.sh@454 -- $ get_config_params
00:01:38.798 13:13:57 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:01:38.798 13:13:57 -- common/autotest_common.sh@10 -- $ set +x
00:01:38.798 13:13:57 -- common/autobuild_common.sh@454 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:01:38.798 13:13:57 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:38.798 13:13:57 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:38.798 13:13:57 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:38.798 13:13:57 -- spdk/autobuild.sh@16 -- $ date -u
00:01:38.798 Wed Jul 24 11:13:57 AM UTC 2024
00:01:38.798 13:13:57 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:38.798 LTS-60-gdbef7efac
00:01:38.798 13:13:57 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:38.798 13:13:57 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:38.798 13:13:57 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:38.798 13:13:57 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:01:38.798 13:13:57 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:38.798 13:13:57 -- common/autotest_common.sh@10 -- $ set +x
00:01:38.798 ************************************
00:01:38.798 START TEST ubsan
00:01:38.798 ************************************
00:01:38.798 13:13:57 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:38.798 using ubsan
00:01:38.798
00:01:38.798 real 0m0.001s
00:01:38.798 user 0m0.000s
00:01:38.798 sys 0m0.001s
00:01:38.798 13:13:57 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:38.798 13:13:57 -- common/autotest_common.sh@10 -- $ set +x
00:01:38.798 ************************************
00:01:38.798 END TEST ubsan
00:01:38.798 ************************************
00:01:38.798 13:13:57 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']'
00:01:38.798 13:13:57 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
00:01:38.798 13:13:57 -- common/autobuild_common.sh@430 -- $ run_test build_native_dpdk _build_native_dpdk
00:01:38.798 13:13:57 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:01:38.799 13:13:57 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:38.799 13:13:57 -- common/autotest_common.sh@10 -- $ set +x
00:01:38.799 ************************************
00:01:38.799 START TEST build_native_dpdk
00:01:38.799 ************************************
00:01:38.799 13:13:57 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk
00:01:38.799 13:13:57 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
00:01:38.799 13:13:57 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
00:01:38.799 13:13:57 -- common/autobuild_common.sh@50 -- $ local compiler_version
00:01:38.799 13:13:57 -- common/autobuild_common.sh@51 -- $ local compiler
00:01:38.799 13:13:57 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
00:01:38.799 13:13:57 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
00:01:38.799 13:13:57 -- common/autobuild_common.sh@55 -- $ compiler=gcc
00:01:38.799 13:13:57 -- common/autobuild_common.sh@61 -- $ export CC=gcc
00:01:38.799 13:13:57 -- common/autobuild_common.sh@61 -- $ CC=gcc
00:01:38.799 13:13:57 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
00:01:38.799 13:13:57 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
00:01:38.799 13:13:57 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
00:01:38.799 13:13:57 -- common/autobuild_common.sh@68 -- $ compiler_version=13
00:01:38.799 13:13:57 -- common/autobuild_common.sh@69 -- $ compiler_version=13
00:01:38.799 13:13:57 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:38.799 13:13:57 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:39.058 13:13:57 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:39.058 13:13:57 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:39.058 13:13:57 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:39.058 eeb0605f11 version: 23.11.0
00:01:39.058 238778122a doc: update release notes for 23.11
00:01:39.058 46aa6b3cfc doc: fix description of RSS features
00:01:39.058 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:39.058 7e421ae345 devtools: support skipping forbid rule check
00:01:39.058 13:13:57 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
00:01:39.058 13:13:57 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
00:01:39.058 13:13:57 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0
00:01:39.058 13:13:57 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
00:01:39.058 13:13:57 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
00:01:39.058 13:13:57 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
00:01:39.058 13:13:57 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
00:01:39.058 13:13:57 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
00:01:39.058 13:13:57 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:39.058 13:13:57 -- common/autobuild_common.sh@168 -- $ uname -s
00:01:39.058 13:13:57 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
00:01:39.058 13:13:57 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0
00:01:39.058 13:13:57 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0
00:01:39.058 13:13:57 -- scripts/common.sh@332 -- $ local ver1 ver1_l
00:01:39.058 13:13:57 -- scripts/common.sh@333 -- $ local ver2 ver2_l
00:01:39.058 13:13:57 -- scripts/common.sh@335 -- $ IFS=.-:
00:01:39.058 13:13:57 -- scripts/common.sh@335 -- $ read -ra ver1
00:01:39.058 13:13:57 -- scripts/common.sh@336 -- $ IFS=.-:
00:01:39.058 13:13:57 -- scripts/common.sh@336 -- $ read -ra ver2
00:01:39.058 13:13:57 -- scripts/common.sh@337 -- $ local 'op=<'
00:01:39.058 13:13:57 -- scripts/common.sh@339 -- $ ver1_l=3
00:01:39.058 13:13:57 -- scripts/common.sh@340 -- $ ver2_l=3
00:01:39.058 13:13:57 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
00:01:39.058 13:13:57 -- scripts/common.sh@343 -- $ case "$op" in
00:01:39.058 13:13:57 -- scripts/common.sh@344 -- $ : 1
00:01:39.058 13:13:57 -- scripts/common.sh@363 -- $ (( v = 0 ))
00:01:39.058 13:13:57 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:39.058 13:13:57 -- scripts/common.sh@364 -- $ decimal 23
00:01:39.058 13:13:57 -- scripts/common.sh@352 -- $ local d=23
00:01:39.058 13:13:57 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:01:39.058 13:13:57 -- scripts/common.sh@354 -- $ echo 23
00:01:39.058 13:13:57 -- scripts/common.sh@364 -- $ ver1[v]=23
00:01:39.058 13:13:57 -- scripts/common.sh@365 -- $ decimal 21
00:01:39.058 13:13:57 -- scripts/common.sh@352 -- $ local d=21
00:01:39.058 13:13:57 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]]
00:01:39.058 13:13:57 -- scripts/common.sh@354 -- $ echo 21
00:01:39.058 13:13:57 -- scripts/common.sh@365 -- $ ver2[v]=21
00:01:39.058 13:13:57 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:01:39.058 13:13:57 -- scripts/common.sh@366 -- $ return 1
00:01:39.058 13:13:57 -- common/autobuild_common.sh@173 -- $ patch -p1
00:01:39.058 patching file config/rte_config.h
00:01:39.058 Hunk #1 succeeded at 60 (offset 1 line).
00:01:39.058 13:13:57 -- common/autobuild_common.sh@176 -- $ lt 23.11.0 24.07.0
00:01:39.058 13:13:57 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 24.07.0
00:01:39.058 13:13:57 -- scripts/common.sh@332 -- $ local ver1 ver1_l
00:01:39.058 13:13:57 -- scripts/common.sh@333 -- $ local ver2 ver2_l
00:01:39.058 13:13:57 -- scripts/common.sh@335 -- $ IFS=.-:
00:01:39.058 13:13:57 -- scripts/common.sh@335 -- $ read -ra ver1
00:01:39.058 13:13:57 -- scripts/common.sh@336 -- $ IFS=.-:
00:01:39.058 13:13:57 -- scripts/common.sh@336 -- $ read -ra ver2
00:01:39.058 13:13:57 -- scripts/common.sh@337 -- $ local 'op=<'
00:01:39.058 13:13:57 -- scripts/common.sh@339 -- $ ver1_l=3
00:01:39.058 13:13:57 -- scripts/common.sh@340 -- $ ver2_l=3
00:01:39.058 13:13:57 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
00:01:39.058 13:13:57 -- scripts/common.sh@343 -- $ case "$op" in
00:01:39.058 13:13:57 -- scripts/common.sh@344 -- $ : 1
00:01:39.058 13:13:57 -- scripts/common.sh@363 -- $ (( v = 0 ))
00:01:39.058 13:13:57 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:39.058 13:13:57 -- scripts/common.sh@364 -- $ decimal 23
00:01:39.058 13:13:57 -- scripts/common.sh@352 -- $ local d=23
00:01:39.058 13:13:57 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:01:39.058 13:13:57 -- scripts/common.sh@354 -- $ echo 23
00:01:39.059 13:13:57 -- scripts/common.sh@364 -- $ ver1[v]=23
00:01:39.059 13:13:57 -- scripts/common.sh@365 -- $ decimal 24
00:01:39.059 13:13:57 -- scripts/common.sh@352 -- $ local d=24
00:01:39.059 13:13:57 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:01:39.059 13:13:57 -- scripts/common.sh@354 -- $ echo 24
00:01:39.059 13:13:57 -- scripts/common.sh@365 -- $ ver2[v]=24
00:01:39.059 13:13:57 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:01:39.059 13:13:57 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] ))
00:01:39.059 13:13:57 -- scripts/common.sh@367 -- $ return 0
00:01:39.059 13:13:57 -- common/autobuild_common.sh@177 -- $ patch -p1
00:01:39.059 patching file lib/pcapng/rte_pcapng.c
00:01:39.059 13:13:57 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false
00:01:39.059 13:13:57 -- common/autobuild_common.sh@181 -- $ uname -s
00:01:39.059 13:13:57 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']'
00:01:39.059 13:13:57 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
00:01:39.059 13:13:57 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:44.333 The Meson build system
00:01:44.333 Version: 1.3.1
00:01:44.333 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:44.333 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:44.333 Build type: native build
00:01:44.333 Program cat found: YES (/usr/bin/cat)
00:01:44.333 Project name: DPDK
00:01:44.333 Project version: 23.11.0
00:01:44.333 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:44.333 C linker for the host machine: gcc ld.bfd 2.39-16
00:01:44.333 Host machine cpu family: x86_64
00:01:44.333 Host machine cpu: x86_64
00:01:44.333 Message: ## Building in Developer Mode ##
00:01:44.333 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:44.333 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:44.333 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:44.333 Program python3 found: YES (/usr/bin/python3)
00:01:44.333 Program cat found: YES (/usr/bin/cat)
00:01:44.333 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
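The `cmp_versions` trace above splits both version strings on `IFS=.-:` and compares them component-wise as integers, which is why `lt 23.11.0 21.11.0` returns 1 while `lt 23.11.0 24.07.0` returns 0 and triggers the rte_pcapng.c patch. A condensed sketch of that less-than check (the function name `version_lt` is hypothetical, not the script's):

```shell
#!/usr/bin/env bash
# Condensed sketch of the dotted-version comparison traced above:
# split on dots and compare each component numerically.
version_lt() {
    local IFS=.
    local -a a b
    read -ra a <<< "$1"
    read -ra b <<< "$2"
    local i
    for i in 0 1 2; do
        (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # strictly less
        (( ${a[i]:-0} > ${b[i]:-0} )) && return 1   # strictly greater
    done
    return 1   # equal versions are not less-than
}

version_lt 23.11.0 21.11.0 || echo "23.11.0 >= 21.11.0"
version_lt 23.11.0 24.07.0 && echo "23.11.0 < 24.07.0"
```

Comparing component-by-component as integers is what makes `23.11.0` sort after `21.11.0` even though a plain string comparison would also happen to agree here; it matters for cases like `9.0.0` vs `10.0.0`.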
00:01:44.333 Compiler for C supports arguments -march=native: YES
00:01:44.333 Checking for size of "void *" : 8
00:01:44.333 Checking for size of "void *" : 8 (cached)
00:01:44.333 Library m found: YES
00:01:44.333 Library numa found: YES
00:01:44.334 Has header "numaif.h" : YES
00:01:44.334 Library fdt found: NO
00:01:44.334 Library execinfo found: NO
00:01:44.334 Has header "execinfo.h" : YES
00:01:44.334 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:44.334 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:44.334 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:44.334 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:44.334 Run-time dependency openssl found: YES 3.0.9
00:01:44.334 Run-time dependency libpcap found: YES 1.10.4
00:01:44.334 Has header "pcap.h" with dependency libpcap: YES
00:01:44.334 Compiler for C supports arguments -Wcast-qual: YES
00:01:44.334 Compiler for C supports arguments -Wdeprecated: YES
00:01:44.334 Compiler for C supports arguments -Wformat: YES
00:01:44.334 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:44.334 Compiler for C supports arguments -Wformat-security: NO
00:01:44.334 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:44.334 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:44.334 Compiler for C supports arguments -Wnested-externs: YES
00:01:44.334 Compiler for C supports arguments -Wold-style-definition: YES
00:01:44.334 Compiler for C supports arguments -Wpointer-arith: YES
00:01:44.334 Compiler for C supports arguments -Wsign-compare: YES
00:01:44.334 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:44.334 Compiler for C supports arguments -Wundef: YES
00:01:44.334 Compiler for C supports arguments -Wwrite-strings: YES
00:01:44.334 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:44.334 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:44.334 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:44.334 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:44.334 Program objdump found: YES (/usr/bin/objdump)
00:01:44.334 Compiler for C supports arguments -mavx512f: YES
00:01:44.334 Checking if "AVX512 checking" compiles: YES
00:01:44.334 Fetching value of define "__SSE4_2__" : 1
00:01:44.334 Fetching value of define "__AES__" : 1
00:01:44.334 Fetching value of define "__AVX__" : 1
00:01:44.334 Fetching value of define "__AVX2__" : 1
00:01:44.334 Fetching value of define "__AVX512BW__" : 1
00:01:44.334 Fetching value of define "__AVX512CD__" : 1
00:01:44.334 Fetching value of define "__AVX512DQ__" : 1
00:01:44.334 Fetching value of define "__AVX512F__" : 1
00:01:44.334 Fetching value of define "__AVX512VL__" : 1
00:01:44.334 Fetching value of define "__PCLMUL__" : 1
00:01:44.334 Fetching value of define "__RDRND__" : 1
00:01:44.334 Fetching value of define "__RDSEED__" : 1
00:01:44.334 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:44.334 Fetching value of define "__znver1__" : (undefined)
00:01:44.334 Fetching value of define "__znver2__" : (undefined)
00:01:44.334 Fetching value of define "__znver3__" : (undefined)
00:01:44.334 Fetching value of define "__znver4__" : (undefined)
00:01:44.334 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:44.334 Message: lib/log: Defining dependency "log"
00:01:44.334 Message: lib/kvargs: Defining dependency "kvargs"
00:01:44.334 Message: lib/telemetry: Defining dependency "telemetry"
00:01:44.334 Checking for function "getentropy" : NO
00:01:44.334 Message: lib/eal: Defining dependency "eal"
00:01:44.334 Message: lib/ring: Defining dependency "ring"
00:01:44.334 Message: lib/rcu: Defining dependency "rcu"
00:01:44.334 Message: lib/mempool: Defining dependency "mempool"
00:01:44.334 Message: lib/mbuf: Defining dependency "mbuf"
00:01:44.334 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:44.334 Fetching
value of define "__AVX512F__" : 1 (cached) 00:01:44.334 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:44.334 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:44.334 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:44.334 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:44.334 Compiler for C supports arguments -mpclmul: YES 00:01:44.334 Compiler for C supports arguments -maes: YES 00:01:44.334 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:44.334 Compiler for C supports arguments -mavx512bw: YES 00:01:44.334 Compiler for C supports arguments -mavx512dq: YES 00:01:44.334 Compiler for C supports arguments -mavx512vl: YES 00:01:44.334 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:44.334 Compiler for C supports arguments -mavx2: YES 00:01:44.334 Compiler for C supports arguments -mavx: YES 00:01:44.334 Message: lib/net: Defining dependency "net" 00:01:44.334 Message: lib/meter: Defining dependency "meter" 00:01:44.334 Message: lib/ethdev: Defining dependency "ethdev" 00:01:44.334 Message: lib/pci: Defining dependency "pci" 00:01:44.334 Message: lib/cmdline: Defining dependency "cmdline" 00:01:44.334 Message: lib/metrics: Defining dependency "metrics" 00:01:44.334 Message: lib/hash: Defining dependency "hash" 00:01:44.334 Message: lib/timer: Defining dependency "timer" 00:01:44.334 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:44.334 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:44.334 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:44.334 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:44.334 Message: lib/acl: Defining dependency "acl" 00:01:44.334 Message: lib/bbdev: Defining dependency "bbdev" 00:01:44.334 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:44.334 Run-time dependency libelf found: YES 0.190 00:01:44.334 Message: lib/bpf: Defining dependency "bpf" 00:01:44.334 Message: lib/cfgfile: Defining dependency 
"cfgfile" 00:01:44.334 Message: lib/compressdev: Defining dependency "compressdev" 00:01:44.334 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:44.334 Message: lib/distributor: Defining dependency "distributor" 00:01:44.334 Message: lib/dmadev: Defining dependency "dmadev" 00:01:44.334 Message: lib/efd: Defining dependency "efd" 00:01:44.334 Message: lib/eventdev: Defining dependency "eventdev" 00:01:44.334 Message: lib/dispatcher: Defining dependency "dispatcher" 00:01:44.334 Message: lib/gpudev: Defining dependency "gpudev" 00:01:44.334 Message: lib/gro: Defining dependency "gro" 00:01:44.334 Message: lib/gso: Defining dependency "gso" 00:01:44.334 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:44.334 Message: lib/jobstats: Defining dependency "jobstats" 00:01:44.334 Message: lib/latencystats: Defining dependency "latencystats" 00:01:44.334 Message: lib/lpm: Defining dependency "lpm" 00:01:44.334 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:44.335 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:44.335 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:44.335 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:44.335 Message: lib/member: Defining dependency "member" 00:01:44.335 Message: lib/pcapng: Defining dependency "pcapng" 00:01:44.335 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:44.335 Message: lib/power: Defining dependency "power" 00:01:44.335 Message: lib/rawdev: Defining dependency "rawdev" 00:01:44.335 Message: lib/regexdev: Defining dependency "regexdev" 00:01:44.335 Message: lib/mldev: Defining dependency "mldev" 00:01:44.335 Message: lib/rib: Defining dependency "rib" 00:01:44.335 Message: lib/reorder: Defining dependency "reorder" 00:01:44.335 Message: lib/sched: Defining dependency "sched" 00:01:44.335 Message: lib/security: Defining dependency "security" 00:01:44.335 Message: lib/stack: Defining dependency "stack" 00:01:44.335 Has header 
"linux/userfaultfd.h" : YES 00:01:44.335 Has header "linux/vduse.h" : YES 00:01:44.335 Message: lib/vhost: Defining dependency "vhost" 00:01:44.335 Message: lib/ipsec: Defining dependency "ipsec" 00:01:44.335 Message: lib/pdcp: Defining dependency "pdcp" 00:01:44.335 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:44.335 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:44.335 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:44.335 Message: lib/fib: Defining dependency "fib" 00:01:44.335 Message: lib/port: Defining dependency "port" 00:01:44.335 Message: lib/pdump: Defining dependency "pdump" 00:01:44.335 Message: lib/table: Defining dependency "table" 00:01:44.335 Message: lib/pipeline: Defining dependency "pipeline" 00:01:44.335 Message: lib/graph: Defining dependency "graph" 00:01:44.335 Message: lib/node: Defining dependency "node" 00:01:44.335 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:46.247 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:46.247 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:46.247 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:46.247 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:46.247 Compiler for C supports arguments -Wno-unused-value: YES 00:01:46.247 Compiler for C supports arguments -Wno-format: YES 00:01:46.247 Compiler for C supports arguments -Wno-format-security: YES 00:01:46.247 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:46.247 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:46.247 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:46.247 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:46.247 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:46.247 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:46.247 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:46.247 Compiler for C supports 
arguments -mavx512bw: YES (cached)
00:01:46.247 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:46.247 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:46.247 Has header "sys/epoll.h" : YES
00:01:46.247 Program doxygen found: YES (/usr/bin/doxygen)
00:01:46.247 Configuring doxy-api-html.conf using configuration
00:01:46.247 Configuring doxy-api-man.conf using configuration
00:01:46.247 Program mandb found: YES (/usr/bin/mandb)
00:01:46.247 Program sphinx-build found: NO
00:01:46.247 Configuring rte_build_config.h using configuration
00:01:46.247 Message: 
00:01:46.247 =================
00:01:46.247 Applications Enabled
00:01:46.247 =================
00:01:46.247 
00:01:46.247 apps:
00:01:46.247 	dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 
00:01:46.247 	test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 
00:01:46.247 	test-pmd, test-regex, test-sad, test-security-perf, 
00:01:46.247 
00:01:46.247 Message: 
00:01:46.247 =================
00:01:46.247 Libraries Enabled
00:01:46.247 =================
00:01:46.247 
00:01:46.247 libs:
00:01:46.247 	log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 
00:01:46.247 	net, meter, ethdev, pci, cmdline, metrics, hash, timer, 
00:01:46.247 	acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 
00:01:46.247 	dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 
00:01:46.247 	jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 
00:01:46.247 	mldev, rib, reorder, sched, security, stack, vhost, ipsec, 
00:01:46.247 	pdcp, fib, port, pdump, table, pipeline, graph, node, 
00:01:46.247 
00:01:46.247 
00:01:46.247 Message: 
00:01:46.247 ===============
00:01:46.247 Drivers Enabled
00:01:46.247 ===============
00:01:46.247 
00:01:46.247 common:
00:01:46.247 
00:01:46.247 bus:
00:01:46.248 	pci, vdev, 
00:01:46.248 mempool:
00:01:46.248 	ring, 
00:01:46.248 dma:
00:01:46.248 
00:01:46.248 net:
00:01:46.248 	i40e, 
00:01:46.248 raw:
00:01:46.248 
00:01:46.248 crypto:
00:01:46.248 
00:01:46.248 compress:
00:01:46.248 
00:01:46.248 regex:
00:01:46.248 
00:01:46.248 ml:
00:01:46.248 
00:01:46.248 vdpa:
00:01:46.248 
00:01:46.248 event:
00:01:46.248 
00:01:46.248 baseband:
00:01:46.248 
00:01:46.248 gpu:
00:01:46.248 
00:01:46.248 
00:01:46.248 Message: 
00:01:46.248 =================
00:01:46.248 Content Skipped
00:01:46.248 =================
00:01:46.248 
00:01:46.248 apps:
00:01:46.248 
00:01:46.248 libs:
00:01:46.248 
00:01:46.248 drivers:
00:01:46.248 	common/cpt: not in enabled drivers build config
00:01:46.248 	common/dpaax: not in enabled drivers build config
00:01:46.248 	common/iavf: not in enabled drivers build config
00:01:46.248 	common/idpf: not in enabled drivers build config
00:01:46.248 	common/mvep: not in enabled drivers build config
00:01:46.248 	common/octeontx: not in enabled drivers build config
00:01:46.248 	bus/auxiliary: not in enabled drivers build config
00:01:46.248 	bus/cdx: not in enabled drivers build config
00:01:46.248 	bus/dpaa: not in enabled drivers build config
00:01:46.248 	bus/fslmc: not in enabled drivers build config
00:01:46.248 	bus/ifpga: not in enabled drivers build config
00:01:46.248 	bus/platform: not in enabled drivers build config
00:01:46.248 	bus/vmbus: not in enabled drivers build config
00:01:46.248 	common/cnxk: not in enabled drivers build config
00:01:46.248 	common/mlx5: not in enabled drivers build config
00:01:46.248 	common/nfp: not in enabled drivers build config
00:01:46.248 	common/qat: not in enabled drivers build config
00:01:46.248 	common/sfc_efx: not in enabled drivers build config
00:01:46.248 	mempool/bucket: not in enabled drivers build config
00:01:46.248 	mempool/cnxk: not in enabled drivers build config
00:01:46.248 	mempool/dpaa: not in enabled drivers build config
00:01:46.248 	mempool/dpaa2: not in enabled drivers build config
00:01:46.248 	mempool/octeontx: not in enabled drivers build config
00:01:46.248 	mempool/stack: not in enabled drivers build config
00:01:46.248 	dma/cnxk: not in enabled drivers build config
00:01:46.248 	dma/dpaa: not in enabled drivers build config
00:01:46.248 	dma/dpaa2: not in enabled drivers build config
00:01:46.248 	dma/hisilicon: not in enabled drivers build config
00:01:46.248 	dma/idxd: not in enabled drivers build config
00:01:46.248 	dma/ioat: not in enabled drivers build config
00:01:46.248 	dma/skeleton: not in enabled drivers build config
00:01:46.248 	net/af_packet: not in enabled drivers build config
00:01:46.248 	net/af_xdp: not in enabled drivers build config
00:01:46.248 	net/ark: not in enabled drivers build config
00:01:46.248 	net/atlantic: not in enabled drivers build config
00:01:46.248 	net/avp: not in enabled drivers build config
00:01:46.248 	net/axgbe: not in enabled drivers build config
00:01:46.248 	net/bnx2x: not in enabled drivers build config
00:01:46.248 	net/bnxt: not in enabled drivers build config
00:01:46.248 	net/bonding: not in enabled drivers build config
00:01:46.248 	net/cnxk: not in enabled drivers build config
00:01:46.248 	net/cpfl: not in enabled drivers build config
00:01:46.248 	net/cxgbe: not in enabled drivers build config
00:01:46.248 	net/dpaa: not in enabled drivers build config
00:01:46.248 	net/dpaa2: not in enabled drivers build config
00:01:46.248 	net/e1000: not in enabled drivers build config
00:01:46.248 	net/ena: not in enabled drivers build config
00:01:46.248 	net/enetc: not in enabled drivers build config
00:01:46.248 	net/enetfec: not in enabled drivers build config
00:01:46.248 	net/enic: not in enabled drivers build config
00:01:46.248 	net/failsafe: not in enabled drivers build config
00:01:46.248 	net/fm10k: not in enabled drivers build config
00:01:46.248 	net/gve: not in enabled drivers build config
00:01:46.248 	net/hinic: not in enabled drivers build config
00:01:46.248 	net/hns3: not in enabled drivers build config
00:01:46.248 	net/iavf: not in enabled drivers build config
00:01:46.248 	net/ice: not in enabled drivers build config
00:01:46.248 	net/idpf: not in enabled drivers build config
00:01:46.248 	net/igc: not in enabled drivers build config
00:01:46.248 	net/ionic: not in enabled drivers build config
00:01:46.248 	net/ipn3ke: not in enabled drivers build config
00:01:46.248 	net/ixgbe: not in enabled drivers build config
00:01:46.248 	net/mana: not in enabled drivers build config
00:01:46.248 	net/memif: not in enabled drivers build config
00:01:46.248 	net/mlx4: not in enabled drivers build config
00:01:46.248 	net/mlx5: not in enabled drivers build config
00:01:46.248 	net/mvneta: not in enabled drivers build config
00:01:46.248 	net/mvpp2: not in enabled drivers build config
00:01:46.248 	net/netvsc: not in enabled drivers build config
00:01:46.248 	net/nfb: not in enabled drivers build config
00:01:46.248 	net/nfp: not in enabled drivers build config
00:01:46.248 	net/ngbe: not in enabled drivers build config
00:01:46.248 	net/null: not in enabled drivers build config
00:01:46.248 	net/octeontx: not in enabled drivers build config
00:01:46.248 	net/octeon_ep: not in enabled drivers build config
00:01:46.248 	net/pcap: not in enabled drivers build config
00:01:46.248 	net/pfe: not in enabled drivers build config
00:01:46.248 	net/qede: not in enabled drivers build config
00:01:46.248 	net/ring: not in enabled drivers build config
00:01:46.248 	net/sfc: not in enabled drivers build config
00:01:46.248 	net/softnic: not in enabled drivers build config
00:01:46.248 	net/tap: not in enabled drivers build config
00:01:46.248 	net/thunderx: not in enabled drivers build config
00:01:46.248 	net/txgbe: not in enabled drivers build config
00:01:46.248 	net/vdev_netvsc: not in enabled drivers build config
00:01:46.248 	net/vhost: not in enabled drivers build config
00:01:46.248 	net/virtio: not in enabled drivers build config
00:01:46.248 	net/vmxnet3: not in enabled drivers build config
00:01:46.248 	raw/cnxk_bphy: not in enabled drivers build config
00:01:46.248 	raw/cnxk_gpio: not in enabled drivers build config
00:01:46.248 	raw/dpaa2_cmdif: not in enabled drivers build config
00:01:46.248 	raw/ifpga: not in enabled drivers build config
00:01:46.248 	raw/ntb: not in enabled drivers build config
00:01:46.248 	raw/skeleton: not in enabled drivers build config
00:01:46.248 	crypto/armv8: not in enabled drivers build config
00:01:46.248 	crypto/bcmfs: not in enabled drivers build config
00:01:46.248 	crypto/caam_jr: not in enabled drivers build config
00:01:46.248 	crypto/ccp: not in enabled drivers build config
00:01:46.248 	crypto/cnxk: not in enabled drivers build config
00:01:46.248 	crypto/dpaa_sec: not in enabled drivers build config
00:01:46.248 	crypto/dpaa2_sec: not in enabled drivers build config
00:01:46.248 	crypto/ipsec_mb: not in enabled drivers build config
00:01:46.248 	crypto/mlx5: not in enabled drivers build config
00:01:46.248 	crypto/mvsam: not in enabled drivers build config
00:01:46.248 	crypto/nitrox: not in enabled drivers build config
00:01:46.248 	crypto/null: not in enabled drivers build config
00:01:46.248 	crypto/octeontx: not in enabled drivers build config
00:01:46.248 	crypto/openssl: not in enabled drivers build config
00:01:46.248 	crypto/scheduler: not in enabled drivers build config
00:01:46.248 	crypto/uadk: not in enabled drivers build config
00:01:46.248 	crypto/virtio: not in enabled drivers build config
00:01:46.248 	compress/isal: not in enabled drivers build config
00:01:46.248 	compress/mlx5: not in enabled drivers build config
00:01:46.248 	compress/octeontx: not in enabled drivers build config
00:01:46.248 	compress/zlib: not in enabled drivers build config
00:01:46.248 	regex/mlx5: not in enabled drivers build config
00:01:46.248 	regex/cn9k: not in enabled drivers build config
00:01:46.248 	ml/cnxk: not in enabled drivers build config
00:01:46.248 	vdpa/ifc: not in enabled drivers build config
00:01:46.248 	vdpa/mlx5: not in enabled drivers build config
00:01:46.248 	vdpa/nfp: not in enabled drivers build config
00:01:46.248 	vdpa/sfc: not in enabled drivers build config
00:01:46.248 	event/cnxk: not in enabled drivers build config
00:01:46.248 	event/dlb2: not in enabled drivers build config
00:01:46.248 	event/dpaa: not in enabled drivers build config
00:01:46.248 	event/dpaa2: not in enabled drivers build config
00:01:46.248 	event/dsw: not in enabled drivers build config
00:01:46.248 	event/opdl: not in enabled drivers build config
00:01:46.248 	event/skeleton: not in enabled drivers build config
00:01:46.248 	event/sw: not in enabled drivers build config
00:01:46.248 	event/octeontx: not in enabled drivers build config
00:01:46.248 	baseband/acc: not in enabled drivers build config
00:01:46.248 	baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:46.248 	baseband/fpga_lte_fec: not in enabled drivers build config
00:01:46.248 	baseband/la12xx: not in enabled drivers build config
00:01:46.248 	baseband/null: not in enabled drivers build config
00:01:46.248 	baseband/turbo_sw: not in enabled drivers build config
00:01:46.248 	gpu/cuda: not in enabled drivers build config
00:01:46.248 
00:01:46.248 
00:01:46.248 Build targets in project: 217
00:01:46.248 
00:01:46.248 DPDK 23.11.0
00:01:46.248 
00:01:46.248 User defined options
00:01:46.248 	libdir : lib
00:01:46.248 	prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:46.248 	c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:46.248 	c_link_args : 
00:01:46.248 	enable_docs : false
00:01:46.248 	enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:46.248 	enable_kmods : false
00:01:46.248 	machine : native
00:01:46.248 	tests : false
00:01:46.248 
00:01:46.248 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:46.248 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
00:01:46.248 13:14:04 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 00:01:46.248 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:46.249 [1/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:46.249 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:46.249 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:46.249 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:46.249 [5/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:46.249 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:46.249 [7/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:46.249 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:46.249 [9/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:46.249 [10/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:46.249 [11/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:46.249 [12/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:46.249 [13/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:46.249 [14/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:46.249 [15/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:46.249 [16/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:46.249 [17/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:46.249 [18/707] Linking static target lib/librte_kvargs.a 00:01:46.249 [19/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:46.249 [20/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:46.249 
[21/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:46.249 [22/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:46.249 [23/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:46.513 [24/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:46.513 [25/707] Linking static target lib/librte_log.a 00:01:46.773 [26/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.773 [27/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:46.773 [28/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:46.773 [29/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:46.773 [30/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:46.773 [31/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:46.773 [32/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:46.773 [33/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:46.773 [34/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:46.773 [35/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:46.773 [36/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:46.773 [37/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:46.773 [38/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:46.773 [39/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:46.773 [40/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:46.773 [41/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:46.773 [42/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:47.035 [43/707] Compiling C object 
lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:47.035 [44/707] Linking static target lib/librte_meter.a 00:01:47.035 [45/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:47.035 [46/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:47.035 [47/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:47.035 [48/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:47.035 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:47.035 [50/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:47.035 [51/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:47.035 [52/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:47.035 [53/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:47.035 [54/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:47.035 [55/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:47.035 [56/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:47.035 [57/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:47.035 [58/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:47.035 [59/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:47.035 [60/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:47.035 [61/707] Linking static target lib/librte_pci.a 00:01:47.035 [62/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:47.035 [63/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:47.035 [64/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:47.035 [65/707] Linking static target lib/librte_ring.a 00:01:47.035 [66/707] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:47.035 [67/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:47.035 [68/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:47.035 [69/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:47.035 [70/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:47.035 [71/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:47.035 [72/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:47.035 [73/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:47.035 [74/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:47.035 [75/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:47.035 [76/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:47.035 [77/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:47.035 [78/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:47.035 [79/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:47.035 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:47.293 [81/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:47.293 [82/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:47.293 [83/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:47.293 [84/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:47.293 [85/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:47.293 [86/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:47.293 [87/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:47.293 [88/707] Compiling C object 
lib/librte_net.a.p/net_rte_arp.c.o 00:01:47.294 [89/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:47.294 [90/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.294 [91/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:47.294 [92/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.294 [93/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:47.294 [94/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:47.294 [95/707] Linking target lib/librte_log.so.24.0 00:01:47.294 [96/707] Linking static target lib/librte_net.a 00:01:47.294 [97/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:47.294 [98/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:47.559 [99/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:47.559 [100/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.559 [101/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:47.559 [102/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:47.559 [103/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:47.559 [104/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:47.559 [105/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:47.559 [106/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:47.559 [107/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.559 [108/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:47.559 [109/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:47.559 [110/707] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:47.559 [111/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:47.559 [112/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:47.560 [113/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:47.560 [114/707] Linking target lib/librte_kvargs.so.24.0 00:01:47.560 [115/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:47.560 [116/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:47.560 [117/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:47.821 [118/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:47.821 [119/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:47.821 [120/707] Linking static target lib/librte_cfgfile.a 00:01:47.821 [121/707] Linking static target lib/librte_cmdline.a 00:01:47.821 [122/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:47.821 [123/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:47.821 [124/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.821 [125/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:47.821 [126/707] Linking static target lib/librte_mempool.a 00:01:47.821 [127/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:47.821 [128/707] Linking static target lib/librte_metrics.a 00:01:47.821 [129/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:47.821 [130/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:47.821 [131/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:47.821 [132/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:47.821 [133/707] Linking static target lib/librte_rcu.a 
00:01:47.821 [134/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:47.821 [135/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:47.821 [136/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:47.821 [137/707] Linking static target lib/librte_eal.a 00:01:47.821 [138/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:47.821 [139/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:47.821 [140/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:47.821 [141/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:48.082 [142/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:48.082 [143/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:48.082 [144/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:48.082 [145/707] Linking static target lib/librte_bitratestats.a 00:01:48.082 [146/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:48.082 [147/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:48.082 [148/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:48.082 [149/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:48.083 [150/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:48.083 [151/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:48.083 [152/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:48.343 [153/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:48.343 [154/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:48.343 [155/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:48.343 [156/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 
00:01:48.343 [157/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:48.343 [158/707] Linking static target lib/librte_compressdev.a 00:01:48.343 [159/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.343 [160/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.344 [161/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:48.344 [162/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:48.344 [163/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:48.344 [164/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.344 [165/707] Linking static target lib/librte_telemetry.a 00:01:48.344 [166/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:48.344 [167/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:48.344 [168/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:48.344 [169/707] Linking static target lib/librte_timer.a 00:01:48.344 [170/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:48.344 [171/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:48.344 [172/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:48.604 [173/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:48.604 [174/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:48.604 [175/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:48.604 [176/707] Linking static target lib/librte_bbdev.a 00:01:48.604 [177/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:48.604 [178/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.604 [179/707] Compiling C object 
lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:48.604 [180/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:48.604 [181/707] Linking static target lib/librte_mbuf.a 00:01:48.604 [182/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:48.604 [183/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:48.604 [184/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:48.604 [185/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:48.604 [186/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:48.604 [187/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:48.604 [188/707] Linking static target lib/librte_jobstats.a 00:01:48.604 [189/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:48.604 [190/707] Linking static target lib/librte_dispatcher.a 00:01:48.604 [191/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:48.863 [192/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:48.863 [193/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:48.863 [194/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:48.863 [195/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:48.863 [196/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:48.863 [197/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:48.863 [198/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:48.863 [199/707] Linking static target lib/librte_distributor.a 00:01:48.863 [200/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:48.863 [201/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:48.863 [202/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 
00:01:48.863 [203/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:48.863 [204/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:48.863 [205/707] Linking static target lib/librte_gpudev.a 00:01:48.863 [206/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:48.863 [207/707] Linking static target lib/librte_dmadev.a 00:01:48.863 [208/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:48.863 [209/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:49.131 [210/707] Linking static target lib/librte_gro.a 00:01:49.131 [211/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:49.131 [212/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:49.131 [213/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.131 [214/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.131 [215/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:49.131 [216/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:49.131 [217/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:49.131 [218/707] Linking static target lib/librte_gso.a 00:01:49.131 [219/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:49.131 [220/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:49.131 [221/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.131 [222/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:49.131 [223/707] Linking target lib/librte_telemetry.so.24.0 00:01:49.131 [224/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.131 [225/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 
00:01:49.131 [226/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:49.131 [227/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:49.131 [228/707] Linking static target lib/librte_latencystats.a 00:01:49.131 [229/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:49.131 [230/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:49.131 [231/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.131 [232/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:49.399 [233/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:49.399 [234/707] Linking static target lib/librte_bpf.a 00:01:49.399 [235/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.399 [236/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:49.399 [237/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:49.399 [238/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:49.399 [239/707] Linking static target lib/librte_ip_frag.a 00:01:49.399 [240/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:49.399 [241/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:49.399 [242/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:49.399 [243/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:49.399 [244/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:49.399 [245/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:49.399 [246/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.399 [247/707] Linking static target lib/librte_rawdev.a 00:01:49.399 
[248/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:49.399 [249/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.399 [250/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:49.399 [251/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:49.399 [252/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.399 [253/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.399 [254/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:49.658 [255/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:49.658 [256/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:49.658 [257/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.658 [258/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.658 [259/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:49.658 [260/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:49.658 [261/707] Linking static target lib/librte_stack.a 00:01:49.658 [262/707] Linking static target lib/librte_regexdev.a 00:01:49.658 [263/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:49.658 [264/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.658 [265/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:49.658 [266/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:49.658 [267/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:49.658 [268/707] Linking static target lib/librte_mldev.a 00:01:49.658 [269/707] Compiling C object 
lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:49.658 [270/707] Linking static target lib/librte_power.a 00:01:49.658 [271/707] Linking static target lib/librte_pcapng.a 00:01:49.658 [272/707] Linking static target lib/librte_reorder.a 00:01:49.658 [273/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.658 [274/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:49.918 [275/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:49.918 [276/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.918 [277/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:49.918 [278/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.918 [279/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:49.918 [280/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:49.918 [281/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:49.918 [282/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:49.918 [283/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:49.918 [284/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:49.918 [285/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:49.918 [286/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:49.918 [287/707] Linking static target lib/librte_efd.a 00:01:49.918 [288/707] Linking static target lib/librte_security.a 00:01:49.918 [289/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:49.918 [290/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:49.918 [291/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:50.183 [292/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:50.183 
[293/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:50.183 [294/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:50.183 [295/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:50.183 [296/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:50.183 [297/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:50.183 [298/707] Linking static target lib/librte_lpm.a 00:01:50.183 [299/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.183 [300/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:50.183 [301/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.183 [302/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:50.183 [303/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:50.183 [304/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:50.183 [305/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.183 [306/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:50.183 [307/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:50.444 [308/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:50.444 [309/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.444 [310/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:50.444 [311/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:50.444 [312/707] Linking static target lib/librte_rib.a 00:01:50.444 [313/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:50.444 [314/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.444 [315/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 
00:01:50.444 [316/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:50.706 [317/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:50.706 [318/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.706 [319/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:50.706 [320/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.706 [321/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:50.706 [322/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:50.970 [323/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.970 [324/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:50.970 [325/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:50.970 [326/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:50.970 [327/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:50.970 [328/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.970 [329/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:50.970 [330/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:50.970 [331/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:50.970 [332/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:50.970 [333/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:50.970 [334/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:50.970 [335/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:50.970 [336/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:50.970 [337/707] Compiling C object 
lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:50.970 [338/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:50.970 [339/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:50.970 [340/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:50.970 [341/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:50.970 [342/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:50.970 [343/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:50.970 [344/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:51.233 [345/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:51.233 [346/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:51.233 [347/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:51.233 [348/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:51.233 [349/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.233 [350/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:51.233 [351/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:51.233 [352/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:51.233 [353/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:51.233 [354/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:51.233 [355/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:51.233 [356/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:51.233 [357/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:51.233 [358/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:51.233 [359/707] Linking static target lib/librte_pdump.a 
00:01:51.233 [360/707] Linking static target lib/librte_cryptodev.a 00:01:51.495 [361/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:51.495 [362/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:51.495 [363/707] Linking static target lib/librte_fib.a 00:01:51.495 [364/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:51.495 [365/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:51.495 [366/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:51.495 [367/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:51.495 [368/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:51.495 [369/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:51.495 [370/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:51.755 [371/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:51.755 [372/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:51.755 [373/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:51.755 [374/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:51.755 [375/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.755 [376/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:51.755 [377/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:51.755 [378/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:51.755 [379/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:52.018 [380/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:52.018 [381/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:52.018 [382/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:52.018 [383/707] Linking 
static target lib/librte_sched.a 00:01:52.018 [384/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:52.018 [385/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:52.018 [386/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:52.018 [387/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.018 [388/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:52.018 [389/707] Linking static target lib/librte_graph.a 00:01:52.018 [390/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:52.018 [391/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.018 [392/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:52.018 [393/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:52.018 [394/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:52.278 [395/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:52.278 [396/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:52.278 [397/707] Linking static target lib/acl/libavx2_tmp.a 00:01:52.278 [398/707] Linking static target lib/librte_hash.a 00:01:52.278 [399/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:52.278 [400/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:52.278 [401/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:52.278 [402/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:52.278 [403/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:52.278 [404/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:52.278 [405/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:52.278 [406/707] Compiling C object 
drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:52.278 [407/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:52.278 [408/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:52.278 [409/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:52.278 [410/707] Linking static target lib/librte_member.a 00:01:52.278 [411/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:52.278 [412/707] Linking static target lib/librte_table.a 00:01:52.278 [413/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:52.278 [414/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:52.278 [415/707] Linking static target drivers/librte_bus_vdev.a 00:01:52.278 [416/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:52.538 [417/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:52.538 [418/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:52.538 [419/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:52.538 [420/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:52.538 [421/707] Linking static target lib/librte_ipsec.a 00:01:52.538 [422/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:52.538 [423/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:52.538 [424/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:52.538 [425/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:52.538 [426/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:52.538 [427/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:52.538 [428/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:52.538 [429/707] Generating lib/sched.sym_chk with a custom 
command (wrapped by meson to capture output) 00:01:52.538 [430/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:01:52.538 [431/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:52.538 [432/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:52.539 [433/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:52.539 [434/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:52.539 [435/707] Linking static target lib/librte_eventdev.a 00:01:52.800 [436/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:52.800 [437/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:52.800 [438/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:52.800 [439/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:52.800 [440/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:52.800 [441/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:52.800 [442/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:52.800 [443/707] Linking static target drivers/librte_bus_pci.a 00:01:52.800 [444/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:52.800 [445/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:52.800 [446/707] Linking static target lib/librte_acl.a 00:01:52.800 [447/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:52.800 [448/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:52.800 [449/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:52.800 [450/707] Linking static target lib/librte_pdcp.a 00:01:52.800 [451/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.801 [452/707] Compiling C object 
app/dpdk-graph.p/graph_neigh.c.o 00:01:52.801 [453/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.062 [454/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:53.062 [455/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:53.062 [456/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:53.062 [457/707] Linking static target lib/librte_ethdev.a 00:01:53.062 [458/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:53.062 [459/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:53.062 [460/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.062 [461/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:53.062 [462/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:53.062 [463/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.062 [464/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:53.062 [465/707] Linking static target lib/librte_port.a 00:01:53.062 [466/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:53.062 [467/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:53.062 [468/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.324 [469/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:53.324 [470/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:53.324 [471/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:53.324 [472/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:53.324 [473/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to 
capture output) 00:01:53.324 [474/707] Linking static target lib/librte_node.a 00:01:53.324 [475/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:53.324 [476/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:53.324 [477/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:53.324 [478/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:53.324 [479/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.324 [480/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:53.324 [481/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:53.324 [482/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:53.324 [483/707] Linking static target drivers/librte_mempool_ring.a 00:01:53.587 [484/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:53.587 [485/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:53.587 [486/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.587 [487/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.587 [488/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.587 [489/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:53.587 [490/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:53.587 [491/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:53.848 [492/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:53.848 [493/707] Compiling C object 
app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:53.848 [494/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:53.848 [495/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:53.848 [496/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:53.848 [497/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:53.848 [498/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:53.848 [499/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.848 [500/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:53.848 [501/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:53.848 [502/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:53.848 [503/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:53.848 [504/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:53.848 [505/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:53.848 [506/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:54.107 [507/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:54.107 [508/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:54.107 [509/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:54.107 [510/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:54.107 [511/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:54.107 [512/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:54.107 [513/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 
00:01:54.107 [514/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:54.107 [515/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:54.107 [516/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:54.107 [517/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:54.107 [518/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:54.107 [519/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:54.107 [520/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:54.107 [521/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:54.107 [522/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.107 [523/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:54.107 [524/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:54.107 [525/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:54.107 [526/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:54.366 [527/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:54.366 [528/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:54.366 [529/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:54.366 [530/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:54.366 [531/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:54.366 [532/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:54.366 [533/707] Compiling C object 
app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:54.366 [534/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:54.366 [535/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:54.625 [536/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:54.625 [537/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:54.625 [538/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:54.625 [539/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:54.625 [540/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:54.626 [541/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:54.626 [542/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:54.626 [543/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:54.626 [544/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:54.626 [545/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:54.626 [546/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:54.626 [547/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:54.626 [548/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:54.626 [549/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:54.626 [550/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:54.626 [551/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:54.626 [552/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:54.626 [553/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:54.938 [554/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 
00:01:54.938 [555/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:54.938 [556/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:54.938 [557/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:54.938 [558/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:54.938 [559/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:54.938 [560/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:54.938 [561/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:54.938 [562/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:54.938 [563/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:54.938 [564/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:54.938 [565/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:55.196 [566/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:55.196 [567/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:55.196 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:55.196 [569/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:55.196 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:55.454 [571/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:55.713 [572/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:55.971 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:55.971 [574/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:55.971 [575/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:56.230 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:56.230 [577/707] 
Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.489 [578/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:56.489 [579/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:56.748 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:57.315 [581/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:57.573 [582/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:57.573 [583/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:57.573 [584/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:58.140 [585/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:58.140 [586/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:58.140 [587/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:58.140 [588/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:58.140 [589/707] Linking static target drivers/librte_net_i40e.a 00:01:58.140 [590/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:58.707 [591/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:59.274 [592/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.211 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.211 [594/707] Linking target lib/librte_eal.so.24.0 00:02:00.211 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:00.469 [596/707] Linking target lib/librte_timer.so.24.0 00:02:00.469 [597/707] Linking target lib/librte_ring.so.24.0 00:02:00.469 [598/707] Linking target lib/librte_pci.so.24.0 00:02:00.469 [599/707] Linking target lib/librte_cfgfile.so.24.0 
00:02:00.469 [600/707] Linking target lib/librte_meter.so.24.0 00:02:00.469 [601/707] Linking target lib/librte_jobstats.so.24.0 00:02:00.469 [602/707] Linking target lib/librte_rawdev.so.24.0 00:02:00.469 [603/707] Linking target lib/librte_stack.so.24.0 00:02:00.469 [604/707] Linking target lib/librte_dmadev.so.24.0 00:02:00.469 [605/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:00.469 [606/707] Linking target lib/librte_acl.so.24.0 00:02:00.469 [607/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:00.469 [608/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:00.469 [609/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:00.469 [610/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:00.469 [611/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:00.469 [612/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:00.469 [613/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:00.469 [614/707] Linking target lib/librte_mempool.so.24.0 00:02:00.469 [615/707] Linking target lib/librte_rcu.so.24.0 00:02:00.469 [616/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:00.727 [617/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:00.727 [618/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:00.727 [619/707] Linking target lib/librte_rib.so.24.0 00:02:00.727 [620/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:00.728 [621/707] Linking target lib/librte_mbuf.so.24.0 00:02:00.728 [622/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:00.986 [623/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:00.986 [624/707] 
Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:00.986 [625/707] Linking target lib/librte_fib.so.24.0 00:02:00.986 [626/707] Linking target lib/librte_bbdev.so.24.0 00:02:00.986 [627/707] Linking target lib/librte_compressdev.so.24.0 00:02:00.986 [628/707] Linking target lib/librte_distributor.so.24.0 00:02:00.986 [629/707] Linking target lib/librte_net.so.24.0 00:02:00.986 [630/707] Linking target lib/librte_sched.so.24.0 00:02:00.986 [631/707] Linking target lib/librte_gpudev.so.24.0 00:02:00.986 [632/707] Linking target lib/librte_cryptodev.so.24.0 00:02:00.986 [633/707] Linking target lib/librte_regexdev.so.24.0 00:02:00.986 [634/707] Linking target lib/librte_reorder.so.24.0 00:02:00.986 [635/707] Linking target lib/librte_mldev.so.24.0 00:02:00.986 [636/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:00.986 [637/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:01.244 [638/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:01.244 [639/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:01.244 [640/707] Linking target lib/librte_security.so.24.0 00:02:01.244 [641/707] Linking target lib/librte_hash.so.24.0 00:02:01.244 [642/707] Linking target lib/librte_cmdline.so.24.0 00:02:01.244 [643/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:01.244 [644/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:01.244 [645/707] Linking target lib/librte_efd.so.24.0 00:02:01.502 [646/707] Linking target lib/librte_ipsec.so.24.0 00:02:01.502 [647/707] Linking target lib/librte_member.so.24.0 00:02:01.502 [648/707] Linking target lib/librte_lpm.so.24.0 00:02:01.502 [649/707] Linking target lib/librte_pdcp.so.24.0 00:02:01.502 [650/707] Generating symbol file 
lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:01.502 [651/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:01.761 [652/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.761 [653/707] Linking target lib/librte_ethdev.so.24.0 00:02:02.019 [654/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:02.019 [655/707] Linking target lib/librte_metrics.so.24.0 00:02:02.019 [656/707] Linking target lib/librte_pcapng.so.24.0 00:02:02.019 [657/707] Linking target lib/librte_gso.so.24.0 00:02:02.019 [658/707] Linking target lib/librte_gro.so.24.0 00:02:02.019 [659/707] Linking target lib/librte_ip_frag.so.24.0 00:02:02.019 [660/707] Linking target lib/librte_power.so.24.0 00:02:02.019 [661/707] Linking target lib/librte_bpf.so.24.0 00:02:02.019 [662/707] Linking target lib/librte_eventdev.so.24.0 00:02:02.019 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:02.277 [664/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:02.277 [665/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:02.277 [666/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:02.277 [667/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:02.277 [668/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:02.277 [669/707] Linking target lib/librte_bitratestats.so.24.0 00:02:02.277 [670/707] Linking target lib/librte_graph.so.24.0 00:02:02.277 [671/707] Linking target lib/librte_latencystats.so.24.0 00:02:02.277 [672/707] Linking target lib/librte_pdump.so.24.0 00:02:02.277 [673/707] Linking target lib/librte_dispatcher.so.24.0 00:02:02.277 [674/707] Linking target lib/librte_port.so.24.0 00:02:02.536 [675/707] Generating symbol file 
lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:02.536 [676/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:02.536 [677/707] Linking target lib/librte_node.so.24.0 00:02:02.536 [678/707] Linking target lib/librte_table.so.24.0 00:02:02.795 [679/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:08.064 [680/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:08.064 [681/707] Linking static target lib/librte_pipeline.a 00:02:09.001 [682/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:09.260 [683/707] Linking static target lib/librte_vhost.a 00:02:09.825 [684/707] Linking target app/dpdk-pdump 00:02:09.825 [685/707] Linking target app/dpdk-test-sad 00:02:09.825 [686/707] Linking target app/dpdk-test-dma-perf 00:02:09.825 [687/707] Linking target app/dpdk-test-cmdline 00:02:09.825 [688/707] Linking target app/dpdk-dumpcap 00:02:09.825 [689/707] Linking target app/dpdk-proc-info 00:02:09.825 [690/707] Linking target app/dpdk-test-acl 00:02:09.825 [691/707] Linking target app/dpdk-graph 00:02:09.825 [692/707] Linking target app/dpdk-test-fib 00:02:09.825 [693/707] Linking target app/dpdk-test-mldev 00:02:09.825 [694/707] Linking target app/dpdk-test-regex 00:02:09.825 [695/707] Linking target app/dpdk-test-security-perf 00:02:09.825 [696/707] Linking target app/dpdk-test-gpudev 00:02:09.825 [697/707] Linking target app/dpdk-test-flow-perf 00:02:09.825 [698/707] Linking target app/dpdk-test-compress-perf 00:02:09.825 [699/707] Linking target app/dpdk-test-pipeline 00:02:09.826 [700/707] Linking target app/dpdk-test-crypto-perf 00:02:09.826 [701/707] Linking target app/dpdk-test-bbdev 00:02:09.826 [702/707] Linking target app/dpdk-test-eventdev 00:02:09.826 [703/707] Linking target app/dpdk-testpmd 00:02:11.201 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.460 
[705/707] Linking target lib/librte_vhost.so.24.0 00:02:13.991 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.991 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:13.991 13:14:32 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 install 00:02:13.991 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:13.991 [0/1] Installing files. 00:02:14.255 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:14.255 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:14.255 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:14.256 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:14.256 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:14.256 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.257 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:14.257 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:14.257 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:14.257 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:14.258 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:14.258 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 
00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:14.259 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:14.259 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.259 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.260 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.260 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:14.520 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:14.520 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:14.520 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:14.520 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_compressdev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:14.520 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:14.520 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.520 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.783 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.783 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.783 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.783 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:02:14.783 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.783 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.783 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.783 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.783 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.783 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.784 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.785 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.786 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.787 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
00:02:14.788 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
00:02:14.788 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24
00:02:14.788 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so
00:02:14.788 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24
00:02:14.788 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so
00:02:14.788 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24
00:02:14.788 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so
00:02:14.788 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24
00:02:14.788 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so
00:02:14.788 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24
00:02:14.788 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so
00:02:14.788 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24
00:02:14.788 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so
00:02:14.788 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24
00:02:14.788 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so
00:02:14.788 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24
00:02:14.788 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so
00:02:14.788 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24
00:02:14.788 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so
00:02:14.788 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24
00:02:14.788 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so
00:02:14.788 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24
00:02:14.788 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so
00:02:14.788 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24
00:02:14.788 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so
00:02:14.788 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24
00:02:14.788 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so
00:02:14.788 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24
00:02:14.788 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so
00:02:14.788 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24
00:02:14.788 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so
00:02:14.788 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24
00:02:14.788 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so
00:02:14.788 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24
00:02:14.788 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so
00:02:14.788 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24
00:02:14.788 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so
00:02:14.788 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24
00:02:14.788 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so
00:02:14.788 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24
00:02:14.788 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so
00:02:14.788 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24
00:02:14.788 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so
00:02:14.788 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24
00:02:14.788 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so
00:02:14.788 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24
00:02:14.788 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so
00:02:14.788 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24
00:02:14.788 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so
00:02:14.788 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24
00:02:14.788 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so
00:02:14.788 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24
00:02:14.788 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so
00:02:14.788 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24
00:02:14.788 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so
00:02:14.788 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24
00:02:14.788 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so
00:02:14.788 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24
00:02:14.788 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so
00:02:14.788 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24
00:02:14.788 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so
00:02:14.788 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24
00:02:14.788 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so
00:02:14.788 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24
00:02:14.789 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so
00:02:14.789 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24
00:02:14.789 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so
00:02:14.789 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24
00:02:14.789 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so
00:02:14.789 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24
00:02:14.789 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so
00:02:14.789 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24
00:02:14.789 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so
00:02:14.789 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24
00:02:14.789 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so
00:02:14.789 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24
00:02:14.789 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so
00:02:14.789 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24
00:02:14.789 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so
00:02:14.789 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24
00:02:14.789 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so
00:02:14.789 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24
00:02:14.789 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so
00:02:14.789 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24
00:02:14.789 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so
00:02:14.789 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24
00:02:14.789 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so
00:02:14.789 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24
00:02:14.789 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so
00:02:14.789 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24
00:02:14.789 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so
00:02:14.789 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24
00:02:14.789 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so
00:02:14.789 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24
00:02:14.789 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so
00:02:14.789 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24
00:02:14.789 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so
00:02:14.789 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24
00:02:14.789 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so
00:02:14.789 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24
00:02:14.789 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so
00:02:14.789 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24
00:02:14.789 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so
00:02:14.789 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24
00:02:14.789 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so
00:02:14.789 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24
00:02:14.789 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so
00:02:14.789 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24
00:02:14.789 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so
00:02:14.789 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24
00:02:14.789 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so
00:02:14.789 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24
00:02:14.789 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so
00:02:14.789 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so'
00:02:14.789 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24'
00:02:14.789 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0'
00:02:14.789 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so'
00:02:14.789 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24'
00:02:14.789 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0'
00:02:14.789 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so'
00:02:14.789 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24'
00:02:14.789 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0'
00:02:14.789 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so'
00:02:14.789 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24'
00:02:14.789 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0'
00:02:14.789 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24
00:02:14.789 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so
00:02:14.789 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24
00:02:14.789 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so
00:02:14.789 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24
00:02:14.789 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so
00:02:14.789 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24
00:02:14.789 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so
00:02:14.789 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0'
00:02:15.048 13:14:33 -- common/autobuild_common.sh@192 -- $ uname -s
00:02:15.048 13:14:33 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:15.048 13:14:33 -- common/autobuild_common.sh@203 -- $ cat
00:02:15.048 13:14:33 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:15.048
00:02:15.048 real 0m36.028s
00:02:15.048 user 9m50.237s
00:02:15.048 sys 2m12.354s
00:02:15.048 13:14:33 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:15.048 13:14:33 -- common/autotest_common.sh@10 -- $ set +x
00:02:15.048 ************************************
00:02:15.048 END TEST build_native_dpdk
00:02:15.048 ************************************
00:02:15.048 13:14:33 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:15.048 13:14:33 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:15.048 13:14:33 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:15.048 13:14:33 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:15.048 13:14:33 -- common/autobuild_common.sh@426 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:15.048 13:14:33 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:02:15.048 13:14:33 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:02:15.048 13:14:33 -- common/autotest_common.sh@10 -- $ set +x
00:02:15.048 ************************************
00:02:15.048 START TEST autobuild_llvm_precompile
00:02:15.048 ************************************
00:02:15.048 13:14:33 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:02:15.048 13:14:33 -- common/autobuild_common.sh@32 -- $ clang --version
00:02:15.048 13:14:33 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:02:15.048 Target: x86_64-redhat-linux-gnu
00:02:15.048 Thread model: posix
00:02:15.048 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:15.048 13:14:33 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:02:15.048 13:14:33 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:02:15.048 13:14:33 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:02:15.048 13:14:33 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:02:15.048 13:14:33 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:02:15.048 13:14:33 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:02:15.048 13:14:33 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:15.048 13:14:33 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:02:15.048 13:14:33 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:02:15.048 13:14:33 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:15.310 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:15.625 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:15.625 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:15.625 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:15.884 Using 'verbs' RDMA provider
00:02:31.754 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:46.637 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:47.203 Creating mk/config.mk...done.
00:02:47.203 Creating mk/cc.flags.mk...done.
00:02:47.203 Type 'make' to build.
00:02:47.204
00:02:47.204 real 0m32.145s
00:02:47.204 user 0m14.736s
00:02:47.204 sys 0m16.812s
00:02:47.204 13:15:05 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:47.204 13:15:05 -- common/autotest_common.sh@10 -- $ set +x
00:02:47.204 ************************************
00:02:47.204 END TEST autobuild_llvm_precompile
00:02:47.204 ************************************
00:02:47.204 13:15:05 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:47.204 13:15:05 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:47.204 13:15:05 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:47.204 13:15:05 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:47.204 13:15:05 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:47.462 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:47.720 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:47.720 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:47.720 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:48.288 Using 'verbs' RDMA provider
00:03:03.742 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:03:16.029 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:03:16.029 Creating mk/config.mk...done.
00:03:16.029 Creating mk/cc.flags.mk...done.
00:03:16.029 Type 'make' to build.
00:03:16.029 13:15:33 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:03:16.029 13:15:33 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:03:16.029 13:15:33 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:03:16.029 13:15:33 -- common/autotest_common.sh@10 -- $ set +x
00:03:16.029 ************************************
00:03:16.029 START TEST make
00:03:16.029 ************************************
00:03:16.029 13:15:33 -- common/autotest_common.sh@1104 -- $ make -j72
00:03:16.029 make[1]: Nothing to be done for 'all'.
00:03:17.405 The Meson build system
00:03:17.405 Version: 1.3.1
00:03:17.405 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:03:17.405 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:17.405 Build type: native build
00:03:17.405 Project name: libvfio-user
00:03:17.405 Project version: 0.0.1
00:03:17.405 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:03:17.405 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:03:17.405 Host machine cpu family: x86_64
00:03:17.405 Host machine cpu: x86_64
00:03:17.405 Run-time dependency threads found: YES
00:03:17.405 Library dl found: YES
00:03:17.405 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:03:17.405 Run-time dependency json-c found: YES 0.17
00:03:17.405 Run-time dependency cmocka found: YES 1.1.7
00:03:17.405 Program pytest-3 found: NO
00:03:17.405 Program flake8 found: NO
00:03:17.405 Program misspell-fixer found: NO
00:03:17.405 Program restructuredtext-lint found: NO
00:03:17.405 Program valgrind found: YES (/usr/bin/valgrind)
00:03:17.405 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:17.405 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:17.405 Compiler for C supports arguments -Wwrite-strings: YES
00:03:17.405 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:17.405 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:03:17.405 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:03:17.405 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:17.405 Build targets in project: 8
00:03:17.405 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:03:17.405 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:03:17.405
00:03:17.405 libvfio-user 0.0.1
00:03:17.405
00:03:17.405 User defined options
00:03:17.405 buildtype : debug
00:03:17.405 default_library: static
00:03:17.405 libdir : /usr/local/lib
00:03:17.405
00:03:17.405 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:03:17.663 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:03:17.921 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:03:17.921 [2/36] Compiling C object samples/null.p/null.c.o
00:03:17.921 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:03:17.921 [4/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:03:17.921 [5/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:03:17.921 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:03:17.921 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:03:17.921 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:03:17.921 [9/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:03:17.921 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:03:17.921 [11/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:03:17.921 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:03:17.921 [13/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:03:17.921 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:03:17.921 [15/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:03:17.921 [16/36] Compiling C object test/unit_tests.p/mocks.c.o
00:03:17.921 [17/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:03:17.921 [18/36] Compiling C object samples/server.p/server.c.o
00:03:17.921 [19/36] Compiling C object samples/client.p/client.c.o
00:03:17.921 [20/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:03:17.921 [21/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:03:17.921 [22/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:03:17.921 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:03:17.921 [24/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:03:17.921 [25/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:03:17.921 [26/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:03:17.921 [27/36] Linking target samples/client
00:03:17.921 [28/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:03:17.921 [29/36] Linking static target lib/libvfio-user.a
00:03:17.921 [30/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:03:17.921 [31/36] Linking target samples/null
00:03:17.921 [32/36] Linking target samples/shadow_ioeventfd_server
00:03:17.921 [33/36] Linking target samples/server
00:03:17.921 [34/36] Linking target samples/gpio-pci-idio-16
00:03:17.921 [35/36] Linking target test/unit_tests
00:03:17.921 [36/36] Linking target samples/lspci
00:03:17.921 INFO: autodetecting backend as ninja
00:03:17.921 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:18.180 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:18.439 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:03:18.439 ninja: no work to do.
00:03:23.712 CC lib/ut_mock/mock.o
00:03:23.712 CC lib/ut/ut.o
00:03:23.712 CC lib/log/log.o
00:03:23.712 CC lib/log/log_flags.o
00:03:23.712 CC lib/log/log_deprecated.o
00:03:23.712 LIB libspdk_ut_mock.a
00:03:23.712 LIB libspdk_ut.a
00:03:23.712 LIB libspdk_log.a
00:03:23.970 CXX lib/trace_parser/trace.o
00:03:23.970 CC lib/ioat/ioat.o
00:03:23.970 CC lib/util/base64.o
00:03:23.970 CC lib/util/cpuset.o
00:03:23.970 CC lib/util/bit_array.o
00:03:23.970 CC lib/util/crc16.o
00:03:23.970 CC lib/util/crc32.o
00:03:23.970 CC lib/dma/dma.o
00:03:23.970 CC lib/util/crc32c.o
00:03:23.970 CC lib/util/crc32_ieee.o
00:03:23.970 CC lib/util/crc64.o
00:03:23.970 CC lib/util/dif.o
00:03:23.970 CC lib/util/fd.o
00:03:23.970 CC lib/util/file.o
00:03:23.970 CC lib/util/hexlify.o
00:03:23.970 CC lib/util/iov.o
00:03:23.970 CC lib/util/math.o
00:03:23.970 CC lib/util/pipe.o
00:03:23.970 CC lib/util/string.o
00:03:23.970 CC lib/util/strerror_tls.o
00:03:23.970 CC lib/util/uuid.o
00:03:23.970 CC lib/util/fd_group.o
00:03:23.970 CC lib/util/zipf.o
00:03:23.970 CC lib/util/xor.o
00:03:24.229 CC lib/vfio_user/host/vfio_user_pci.o
00:03:24.229 CC lib/vfio_user/host/vfio_user.o
00:03:24.229 LIB libspdk_dma.a
00:03:24.229 LIB libspdk_ioat.a
00:03:24.229 LIB libspdk_vfio_user.a
00:03:24.487 LIB libspdk_util.a
00:03:24.745 LIB libspdk_trace_parser.a
00:03:24.745 CC lib/vmd/vmd.o
00:03:24.745 CC lib/vmd/led.o
00:03:24.745 CC lib/rdma/common.o
00:03:24.745 CC lib/env_dpdk/env.o
00:03:24.745 CC lib/rdma/rdma_verbs.o
00:03:24.745 CC lib/env_dpdk/memory.o
00:03:24.745 CC lib/env_dpdk/pci.o
00:03:24.745 CC lib/env_dpdk/init.o
00:03:24.745 CC lib/env_dpdk/threads.o
00:03:24.745 CC lib/env_dpdk/pci_ioat.o
00:03:24.745 CC lib/env_dpdk/pci_virtio.o
00:03:24.745 CC lib/idxd/idxd.o
00:03:24.745 CC lib/env_dpdk/pci_vmd.o
00:03:24.745 CC lib/conf/conf.o
00:03:24.745 CC lib/idxd/idxd_user.o
00:03:24.745 CC lib/env_dpdk/pci_idxd.o
00:03:24.745 CC lib/idxd/idxd_kernel.o
00:03:24.745 CC lib/env_dpdk/pci_event.o
00:03:24.745 CC lib/env_dpdk/sigbus_handler.o
00:03:24.745 CC lib/env_dpdk/pci_dpdk.o
00:03:24.745 CC lib/json/json_parse.o
00:03:24.745 CC lib/env_dpdk/pci_dpdk_2207.o
00:03:24.745 CC lib/env_dpdk/pci_dpdk_2211.o
00:03:24.745 CC lib/json/json_write.o
00:03:24.745 CC lib/json/json_util.o
00:03:25.003 LIB libspdk_conf.a
00:03:25.003 LIB libspdk_rdma.a
00:03:25.003 LIB libspdk_json.a
00:03:25.262 LIB libspdk_vmd.a
00:03:25.262 LIB libspdk_idxd.a
00:03:25.262 CC lib/jsonrpc/jsonrpc_server.o
00:03:25.262 CC lib/jsonrpc/jsonrpc_server_tcp.o
00:03:25.262 CC lib/jsonrpc/jsonrpc_client.o
00:03:25.262 CC lib/jsonrpc/jsonrpc_client_tcp.o
00:03:25.520 LIB libspdk_jsonrpc.a
00:03:26.089 CC lib/rpc/rpc.o
00:03:26.089 LIB libspdk_env_dpdk.a
00:03:26.089 LIB libspdk_rpc.a
00:03:26.349 CC lib/sock/sock.o
00:03:26.349 CC lib/sock/sock_rpc.o
00:03:26.349 CC lib/trace/trace.o
00:03:26.349 CC lib/trace/trace_flags.o
00:03:26.349 CC lib/trace/trace_rpc.o
00:03:26.349 CC lib/notify/notify_rpc.o
00:03:26.349 CC lib/notify/notify.o
00:03:26.608 LIB libspdk_notify.a
00:03:26.608 LIB libspdk_trace.a
00:03:26.608 LIB libspdk_sock.a
00:03:26.868 CC lib/thread/thread.o
00:03:26.868 CC lib/thread/iobuf.o
00:03:27.126 CC lib/nvme/nvme_ctrlr_cmd.o
00:03:27.126 CC lib/nvme/nvme_ctrlr.o
00:03:27.126 CC lib/nvme/nvme_fabric.o
00:03:27.126 CC lib/nvme/nvme_ns.o
00:03:27.126 CC lib/nvme/nvme_ns_cmd.o
00:03:27.126 CC lib/nvme/nvme_pcie.o
00:03:27.126 CC lib/nvme/nvme_pcie_common.o
00:03:27.126 CC lib/nvme/nvme_qpair.o
00:03:27.127 CC lib/nvme/nvme.o
00:03:27.127 CC lib/nvme/nvme_quirks.o
00:03:27.127 CC lib/nvme/nvme_transport.o
00:03:27.127 CC lib/nvme/nvme_discovery.o
00:03:27.127 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o
00:03:27.127 CC lib/nvme/nvme_ns_ocssd_cmd.o
00:03:27.127 CC lib/nvme/nvme_tcp.o
00:03:27.127 CC lib/nvme/nvme_opal.o
00:03:27.127 CC lib/nvme/nvme_io_msg.o
00:03:27.127 CC lib/nvme/nvme_poll_group.o
00:03:27.127 CC lib/nvme/nvme_zns.o
00:03:27.127 CC lib/nvme/nvme_cuse.o
00:03:27.127 CC lib/nvme/nvme_vfio_user.o
00:03:27.127 CC lib/nvme/nvme_rdma.o
00:03:28.063 LIB libspdk_thread.a
00:03:28.321 CC lib/blob/request.o
00:03:28.321 CC lib/blob/blobstore.o
00:03:28.321 CC lib/blob/blob_bs_dev.o
00:03:28.321 CC lib/blob/zeroes.o
00:03:28.321 CC lib/vfu_tgt/tgt_endpoint.o
00:03:28.321 CC lib/vfu_tgt/tgt_rpc.o
00:03:28.321 CC lib/virtio/virtio.o
00:03:28.321 CC lib/virtio/virtio_vhost_user.o
00:03:28.321 CC lib/virtio/virtio_vfio_user.o
00:03:28.321 CC lib/init/json_config.o
00:03:28.321 CC lib/virtio/virtio_pci.o
00:03:28.321 CC lib/init/subsystem.o
00:03:28.321 CC lib/init/subsystem_rpc.o
00:03:28.321 CC lib/init/rpc.o
00:03:28.321 CC lib/accel/accel.o
00:03:28.321 CC lib/accel/accel_rpc.o
00:03:28.321 CC lib/accel/accel_sw.o
00:03:28.581 LIB libspdk_init.a
00:03:28.581 LIB libspdk_vfu_tgt.a
00:03:28.581 LIB libspdk_virtio.a
00:03:28.840 LIB libspdk_nvme.a
00:03:28.840 CC lib/event/app.o
00:03:28.840 CC lib/event/reactor.o
00:03:28.840 CC lib/event/log_rpc.o
00:03:28.840 CC lib/event/app_rpc.o
00:03:28.840 CC lib/event/scheduler_static.o
00:03:29.407 LIB libspdk_event.a
00:03:29.407 LIB libspdk_accel.a
00:03:29.665 CC lib/bdev/bdev.o
00:03:29.665 CC lib/bdev/bdev_rpc.o
00:03:29.665 CC lib/bdev/bdev_zone.o
00:03:29.665 CC lib/bdev/scsi_nvme.o
00:03:29.665 CC lib/bdev/part.o
00:03:30.601 LIB libspdk_blob.a
00:03:30.860 CC lib/blobfs/blobfs.o
00:03:30.860 CC lib/blobfs/tree.o
00:03:31.119 CC lib/lvol/lvol.o
00:03:31.378 LIB libspdk_bdev.a
00:03:31.638 LIB libspdk_lvol.a
00:03:31.638 LIB libspdk_blobfs.a
00:03:31.638 CC lib/scsi/port.o
00:03:31.638 CC lib/scsi/dev.o
00:03:31.638 CC lib/scsi/lun.o
00:03:31.638 CC lib/scsi/scsi.o
00:03:31.638 CC lib/scsi/scsi_bdev.o
00:03:31.638 CC lib/scsi/scsi_pr.o
00:03:31.638 CC lib/scsi/scsi_rpc.o
00:03:31.638 CC lib/scsi/task.o
00:03:31.638 CC lib/nvmf/ctrlr.o
00:03:31.638 CC lib/ublk/ublk.o
00:03:31.638 CC lib/nvmf/ctrlr_discovery.o
00:03:31.638 CC lib/ublk/ublk_rpc.o
00:03:31.638
CC lib/nvmf/ctrlr_bdev.o 00:03:31.638 CC lib/nvmf/subsystem.o 00:03:31.638 CC lib/nvmf/nvmf_rpc.o 00:03:31.638 CC lib/nvmf/nvmf.o 00:03:31.638 CC lib/nbd/nbd_rpc.o 00:03:31.638 CC lib/nvmf/tcp.o 00:03:31.638 CC lib/nvmf/transport.o 00:03:31.638 CC lib/nbd/nbd.o 00:03:31.638 CC lib/nvmf/vfio_user.o 00:03:31.638 CC lib/nvmf/rdma.o 00:03:31.638 CC lib/ftl/ftl_layout.o 00:03:31.638 CC lib/ftl/ftl_core.o 00:03:31.638 CC lib/ftl/ftl_init.o 00:03:31.638 CC lib/ftl/ftl_debug.o 00:03:31.638 CC lib/ftl/ftl_io.o 00:03:31.638 CC lib/ftl/ftl_sb.o 00:03:31.638 CC lib/ftl/ftl_l2p_flat.o 00:03:31.638 CC lib/ftl/ftl_l2p.o 00:03:31.897 CC lib/ftl/ftl_nv_cache.o 00:03:31.897 CC lib/ftl/ftl_band.o 00:03:31.897 CC lib/ftl/ftl_band_ops.o 00:03:31.897 CC lib/ftl/ftl_writer.o 00:03:31.897 CC lib/ftl/ftl_reloc.o 00:03:31.897 CC lib/ftl/ftl_rq.o 00:03:31.897 CC lib/ftl/ftl_l2p_cache.o 00:03:31.897 CC lib/ftl/ftl_p2l.o 00:03:31.897 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:31.898 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:31.898 CC lib/ftl/utils/ftl_md.o 00:03:31.898 CC lib/ftl/utils/ftl_mempool.o 00:03:31.898 CC lib/ftl/utils/ftl_conf.o 00:03:31.898 CC lib/ftl/utils/ftl_bitmap.o 00:03:31.898 CC lib/ftl/utils/ftl_property.o 00:03:31.898 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:31.898 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:31.898 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:31.898 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:31.898 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:31.898 CC 
lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:31.898 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:31.898 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:31.898 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:31.898 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:31.898 CC lib/ftl/base/ftl_base_dev.o 00:03:31.898 CC lib/ftl/base/ftl_base_bdev.o 00:03:31.898 CC lib/ftl/ftl_trace.o 00:03:32.466 LIB libspdk_nbd.a 00:03:32.466 LIB libspdk_scsi.a 00:03:32.466 LIB libspdk_ublk.a 00:03:32.467 LIB libspdk_ftl.a 00:03:32.725 CC lib/iscsi/conn.o 00:03:32.725 CC lib/vhost/vhost.o 00:03:32.725 CC lib/iscsi/init_grp.o 00:03:32.725 CC lib/vhost/vhost_rpc.o 00:03:32.725 CC lib/iscsi/iscsi.o 00:03:32.725 CC lib/iscsi/portal_grp.o 00:03:32.725 CC lib/vhost/vhost_scsi.o 00:03:32.725 CC lib/iscsi/md5.o 00:03:32.725 CC lib/vhost/vhost_blk.o 00:03:32.725 CC lib/iscsi/param.o 00:03:32.725 CC lib/vhost/rte_vhost_user.o 00:03:32.725 CC lib/iscsi/tgt_node.o 00:03:32.725 CC lib/iscsi/iscsi_rpc.o 00:03:32.725 CC lib/iscsi/iscsi_subsystem.o 00:03:32.725 CC lib/iscsi/task.o 00:03:33.661 LIB libspdk_iscsi.a 00:03:33.661 LIB libspdk_nvmf.a 00:03:33.661 LIB libspdk_vhost.a 00:03:33.920 CC module/env_dpdk/env_dpdk_rpc.o 00:03:33.920 CC module/vfu_device/vfu_virtio.o 00:03:33.920 CC module/vfu_device/vfu_virtio_blk.o 00:03:33.920 CC module/vfu_device/vfu_virtio_scsi.o 00:03:33.920 CC module/vfu_device/vfu_virtio_rpc.o 00:03:34.179 CC module/scheduler/gscheduler/gscheduler.o 00:03:34.179 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:34.179 CC module/blob/bdev/blob_bdev.o 00:03:34.179 CC module/accel/ioat/accel_ioat_rpc.o 00:03:34.179 CC module/accel/ioat/accel_ioat.o 00:03:34.179 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:34.179 CC module/accel/error/accel_error.o 00:03:34.179 CC module/accel/error/accel_error_rpc.o 00:03:34.179 CC module/accel/dsa/accel_dsa.o 00:03:34.179 CC module/accel/dsa/accel_dsa_rpc.o 00:03:34.179 CC module/accel/iaa/accel_iaa.o 00:03:34.179 CC module/accel/iaa/accel_iaa_rpc.o 00:03:34.179 CC 
module/sock/posix/posix.o 00:03:34.179 LIB libspdk_env_dpdk_rpc.a 00:03:34.179 LIB libspdk_scheduler_dpdk_governor.a 00:03:34.179 LIB libspdk_scheduler_gscheduler.a 00:03:34.179 LIB libspdk_accel_error.a 00:03:34.179 LIB libspdk_blob_bdev.a 00:03:34.438 LIB libspdk_scheduler_dynamic.a 00:03:34.438 LIB libspdk_accel_ioat.a 00:03:34.438 LIB libspdk_accel_iaa.a 00:03:34.438 LIB libspdk_accel_dsa.a 00:03:34.438 LIB libspdk_vfu_device.a 00:03:34.695 CC module/bdev/delay/vbdev_delay.o 00:03:34.695 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:34.695 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:34.695 CC module/blobfs/bdev/blobfs_bdev.o 00:03:34.695 CC module/bdev/error/vbdev_error_rpc.o 00:03:34.695 CC module/bdev/gpt/vbdev_gpt.o 00:03:34.695 CC module/bdev/gpt/gpt.o 00:03:34.695 CC module/bdev/error/vbdev_error.o 00:03:34.695 CC module/bdev/passthru/vbdev_passthru.o 00:03:34.695 CC module/bdev/split/vbdev_split.o 00:03:34.695 CC module/bdev/split/vbdev_split_rpc.o 00:03:34.695 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:34.695 CC module/bdev/malloc/bdev_malloc.o 00:03:34.695 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:34.695 CC module/bdev/raid/bdev_raid.o 00:03:34.695 CC module/bdev/null/bdev_null.o 00:03:34.695 CC module/bdev/raid/bdev_raid_sb.o 00:03:34.695 CC module/bdev/raid/bdev_raid_rpc.o 00:03:34.695 CC module/bdev/null/bdev_null_rpc.o 00:03:34.695 CC module/bdev/raid/concat.o 00:03:34.695 CC module/bdev/raid/raid0.o 00:03:34.695 CC module/bdev/raid/raid1.o 00:03:34.695 CC module/bdev/lvol/vbdev_lvol.o 00:03:34.695 CC module/bdev/ftl/bdev_ftl.o 00:03:34.695 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:34.695 LIB libspdk_sock_posix.a 00:03:34.695 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:34.695 CC module/bdev/nvme/bdev_nvme.o 00:03:34.695 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:34.695 CC module/bdev/nvme/nvme_rpc.o 00:03:34.695 CC module/bdev/nvme/bdev_mdns_client.o 00:03:34.695 CC module/bdev/iscsi/bdev_iscsi.o 00:03:34.695 CC 
module/bdev/nvme/vbdev_opal_rpc.o 00:03:34.695 CC module/bdev/nvme/vbdev_opal.o 00:03:34.695 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:34.695 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:34.695 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:34.695 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:34.695 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:34.695 CC module/bdev/aio/bdev_aio.o 00:03:34.695 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:34.695 CC module/bdev/aio/bdev_aio_rpc.o 00:03:34.695 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:34.952 LIB libspdk_blobfs_bdev.a 00:03:34.952 LIB libspdk_bdev_gpt.a 00:03:34.952 LIB libspdk_bdev_error.a 00:03:34.952 LIB libspdk_bdev_null.a 00:03:34.952 LIB libspdk_bdev_ftl.a 00:03:34.952 LIB libspdk_bdev_passthru.a 00:03:35.257 LIB libspdk_bdev_split.a 00:03:35.257 LIB libspdk_bdev_zone_block.a 00:03:35.257 LIB libspdk_bdev_delay.a 00:03:35.257 LIB libspdk_bdev_iscsi.a 00:03:35.257 LIB libspdk_bdev_aio.a 00:03:35.257 LIB libspdk_bdev_lvol.a 00:03:35.257 LIB libspdk_bdev_malloc.a 00:03:35.257 LIB libspdk_bdev_virtio.a 00:03:35.540 LIB libspdk_bdev_raid.a 00:03:36.477 LIB libspdk_bdev_nvme.a 00:03:37.044 CC module/event/subsystems/vmd/vmd.o 00:03:37.044 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:37.044 CC module/event/subsystems/sock/sock.o 00:03:37.044 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:37.044 CC module/event/subsystems/iobuf/iobuf.o 00:03:37.044 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:37.044 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:37.044 CC module/event/subsystems/scheduler/scheduler.o 00:03:37.303 LIB libspdk_event_vhost_blk.a 00:03:37.303 LIB libspdk_event_vmd.a 00:03:37.303 LIB libspdk_event_sock.a 00:03:37.303 LIB libspdk_event_vfu_tgt.a 00:03:37.303 LIB libspdk_event_scheduler.a 00:03:37.303 LIB libspdk_event_iobuf.a 00:03:37.562 CC module/event/subsystems/accel/accel.o 00:03:37.822 LIB libspdk_event_accel.a 00:03:38.081 CC module/event/subsystems/bdev/bdev.o 
00:03:38.340 LIB libspdk_event_bdev.a 00:03:38.597 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:38.597 CC module/event/subsystems/ublk/ublk.o 00:03:38.597 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:38.597 CC module/event/subsystems/scsi/scsi.o 00:03:38.597 CC module/event/subsystems/nbd/nbd.o 00:03:38.597 LIB libspdk_event_ublk.a 00:03:38.856 LIB libspdk_event_scsi.a 00:03:38.856 LIB libspdk_event_nbd.a 00:03:38.856 LIB libspdk_event_nvmf.a 00:03:39.115 CC module/event/subsystems/iscsi/iscsi.o 00:03:39.115 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:39.115 LIB libspdk_event_iscsi.a 00:03:39.115 LIB libspdk_event_vhost_scsi.a 00:03:39.686 TEST_HEADER include/spdk/accel.h 00:03:39.686 CXX app/trace/trace.o 00:03:39.686 TEST_HEADER include/spdk/accel_module.h 00:03:39.687 CC test/rpc_client/rpc_client_test.o 00:03:39.687 TEST_HEADER include/spdk/assert.h 00:03:39.687 TEST_HEADER include/spdk/barrier.h 00:03:39.687 TEST_HEADER include/spdk/base64.h 00:03:39.687 CC app/trace_record/trace_record.o 00:03:39.687 TEST_HEADER include/spdk/bdev.h 00:03:39.687 TEST_HEADER include/spdk/bdev_module.h 00:03:39.687 CC app/spdk_top/spdk_top.o 00:03:39.687 TEST_HEADER include/spdk/bdev_zone.h 00:03:39.687 TEST_HEADER include/spdk/bit_array.h 00:03:39.687 TEST_HEADER include/spdk/bit_pool.h 00:03:39.687 CC app/spdk_nvme_perf/perf.o 00:03:39.687 TEST_HEADER include/spdk/blob_bdev.h 00:03:39.687 CC app/spdk_lspci/spdk_lspci.o 00:03:39.687 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:39.687 TEST_HEADER include/spdk/blobfs.h 00:03:39.687 TEST_HEADER include/spdk/conf.h 00:03:39.687 CC app/spdk_nvme_discover/discovery_aer.o 00:03:39.687 TEST_HEADER include/spdk/blob.h 00:03:39.687 TEST_HEADER include/spdk/config.h 00:03:39.687 TEST_HEADER include/spdk/cpuset.h 00:03:39.687 CC app/spdk_nvme_identify/identify.o 00:03:39.687 TEST_HEADER include/spdk/crc16.h 00:03:39.687 TEST_HEADER include/spdk/crc32.h 00:03:39.687 TEST_HEADER include/spdk/crc64.h 00:03:39.687 
TEST_HEADER include/spdk/dif.h 00:03:39.687 TEST_HEADER include/spdk/dma.h 00:03:39.687 TEST_HEADER include/spdk/endian.h 00:03:39.687 TEST_HEADER include/spdk/env_dpdk.h 00:03:39.687 TEST_HEADER include/spdk/env.h 00:03:39.687 TEST_HEADER include/spdk/event.h 00:03:39.687 TEST_HEADER include/spdk/fd_group.h 00:03:39.687 TEST_HEADER include/spdk/fd.h 00:03:39.687 TEST_HEADER include/spdk/file.h 00:03:39.687 TEST_HEADER include/spdk/ftl.h 00:03:39.687 TEST_HEADER include/spdk/gpt_spec.h 00:03:39.687 TEST_HEADER include/spdk/hexlify.h 00:03:39.687 TEST_HEADER include/spdk/histogram_data.h 00:03:39.687 TEST_HEADER include/spdk/idxd.h 00:03:39.687 CC app/nvmf_tgt/nvmf_main.o 00:03:39.687 TEST_HEADER include/spdk/idxd_spec.h 00:03:39.687 TEST_HEADER include/spdk/init.h 00:03:39.687 TEST_HEADER include/spdk/ioat.h 00:03:39.687 TEST_HEADER include/spdk/ioat_spec.h 00:03:39.687 CC app/iscsi_tgt/iscsi_tgt.o 00:03:39.687 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:39.687 TEST_HEADER include/spdk/iscsi_spec.h 00:03:39.687 TEST_HEADER include/spdk/json.h 00:03:39.687 TEST_HEADER include/spdk/jsonrpc.h 00:03:39.687 CC test/app/histogram_perf/histogram_perf.o 00:03:39.687 TEST_HEADER include/spdk/likely.h 00:03:39.687 CC app/spdk_dd/spdk_dd.o 00:03:39.687 TEST_HEADER include/spdk/log.h 00:03:39.687 TEST_HEADER include/spdk/lvol.h 00:03:39.687 TEST_HEADER include/spdk/memory.h 00:03:39.687 CC test/app/stub/stub.o 00:03:39.687 TEST_HEADER include/spdk/mmio.h 00:03:39.687 TEST_HEADER include/spdk/nbd.h 00:03:39.687 CC test/app/jsoncat/jsoncat.o 00:03:39.687 TEST_HEADER include/spdk/notify.h 00:03:39.687 CC app/vhost/vhost.o 00:03:39.687 CC test/event/reactor/reactor.o 00:03:39.687 TEST_HEADER include/spdk/nvme.h 00:03:39.687 CC test/env/vtophys/vtophys.o 00:03:39.687 CC test/nvme/sgl/sgl.o 00:03:39.687 CC test/env/pci/pci_ut.o 00:03:39.687 TEST_HEADER include/spdk/nvme_intel.h 00:03:39.687 CC test/nvme/e2edp/nvme_dp.o 00:03:39.687 CC test/env/memory/memory_ut.o 00:03:39.687 CC 
examples/nvme/nvme_manage/nvme_manage.o 00:03:39.687 CC test/nvme/simple_copy/simple_copy.o 00:03:39.687 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:39.687 CC test/nvme/reset/reset.o 00:03:39.687 CC test/event/event_perf/event_perf.o 00:03:39.687 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:39.687 CC examples/nvme/reconnect/reconnect.o 00:03:39.687 CC test/nvme/overhead/overhead.o 00:03:39.687 CC test/nvme/aer/aer.o 00:03:39.687 CC examples/nvme/hello_world/hello_world.o 00:03:39.687 TEST_HEADER include/spdk/nvme_spec.h 00:03:39.687 CC test/nvme/fused_ordering/fused_ordering.o 00:03:39.687 CC test/nvme/connect_stress/connect_stress.o 00:03:39.687 CC test/nvme/startup/startup.o 00:03:39.687 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:39.687 CC test/thread/poller_perf/poller_perf.o 00:03:39.687 CC test/nvme/err_injection/err_injection.o 00:03:39.687 TEST_HEADER include/spdk/nvme_zns.h 00:03:39.687 CC examples/nvme/arbitration/arbitration.o 00:03:39.687 CC examples/ioat/perf/perf.o 00:03:39.687 CC test/nvme/compliance/nvme_compliance.o 00:03:39.687 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:39.687 CC examples/idxd/perf/perf.o 00:03:39.687 CC test/nvme/boot_partition/boot_partition.o 00:03:39.687 CC test/nvme/cuse/cuse.o 00:03:39.687 CC app/fio/nvme/fio_plugin.o 00:03:39.687 CC examples/nvme/hotplug/hotplug.o 00:03:39.687 CC app/spdk_tgt/spdk_tgt.o 00:03:39.687 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:39.687 CC test/event/reactor_perf/reactor_perf.o 00:03:39.687 CC examples/util/zipf/zipf.o 00:03:39.687 CC test/nvme/reserve/reserve.o 00:03:39.687 CC test/thread/lock/spdk_lock.o 00:03:39.687 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:39.687 CC examples/accel/perf/accel_perf.o 00:03:39.687 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:39.687 TEST_HEADER include/spdk/nvmf.h 00:03:39.687 TEST_HEADER include/spdk/nvmf_spec.h 00:03:39.687 CC test/nvme/fdp/fdp.o 00:03:39.687 CC examples/sock/hello_world/hello_sock.o 00:03:39.687 CC 
examples/vmd/lsvmd/lsvmd.o 00:03:39.687 TEST_HEADER include/spdk/nvmf_transport.h 00:03:39.687 CC test/event/app_repeat/app_repeat.o 00:03:39.687 TEST_HEADER include/spdk/opal.h 00:03:39.687 TEST_HEADER include/spdk/opal_spec.h 00:03:39.687 CC test/app/bdev_svc/bdev_svc.o 00:03:39.687 TEST_HEADER include/spdk/pci_ids.h 00:03:39.687 TEST_HEADER include/spdk/pipe.h 00:03:39.687 TEST_HEADER include/spdk/queue.h 00:03:39.687 TEST_HEADER include/spdk/reduce.h 00:03:39.687 TEST_HEADER include/spdk/rpc.h 00:03:39.687 TEST_HEADER include/spdk/scheduler.h 00:03:39.687 TEST_HEADER include/spdk/scsi.h 00:03:39.687 CC test/bdev/bdevio/bdevio.o 00:03:39.687 CC test/accel/dif/dif.o 00:03:39.687 TEST_HEADER include/spdk/scsi_spec.h 00:03:39.687 TEST_HEADER include/spdk/sock.h 00:03:39.687 TEST_HEADER include/spdk/stdinc.h 00:03:39.687 CC examples/nvmf/nvmf/nvmf.o 00:03:39.687 TEST_HEADER include/spdk/string.h 00:03:39.687 CC examples/blob/cli/blobcli.o 00:03:39.688 TEST_HEADER include/spdk/thread.h 00:03:39.688 LINK spdk_lspci 00:03:39.688 CC examples/bdev/bdevperf/bdevperf.o 00:03:39.688 CC examples/blob/hello_world/hello_blob.o 00:03:39.688 CC test/dma/test_dma/test_dma.o 00:03:39.688 CC test/event/scheduler/scheduler.o 00:03:39.688 TEST_HEADER include/spdk/trace.h 00:03:39.688 TEST_HEADER include/spdk/trace_parser.h 00:03:39.688 CC examples/bdev/hello_world/hello_bdev.o 00:03:39.688 CC test/blobfs/mkfs/mkfs.o 00:03:39.688 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:39.688 TEST_HEADER include/spdk/tree.h 00:03:39.688 CC examples/thread/thread/thread_ex.o 00:03:39.688 TEST_HEADER include/spdk/ublk.h 00:03:39.688 CC test/env/mem_callbacks/mem_callbacks.o 00:03:39.688 TEST_HEADER include/spdk/util.h 00:03:39.688 TEST_HEADER include/spdk/uuid.h 00:03:39.688 TEST_HEADER include/spdk/version.h 00:03:39.948 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:39.948 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:39.948 TEST_HEADER include/spdk/vhost.h 00:03:39.948 LINK rpc_client_test 
00:03:39.948 TEST_HEADER include/spdk/vmd.h 00:03:39.948 CC test/lvol/esnap/esnap.o 00:03:39.948 TEST_HEADER include/spdk/xor.h 00:03:39.948 TEST_HEADER include/spdk/zipf.h 00:03:39.948 CXX test/cpp_headers/accel.o 00:03:39.948 LINK spdk_nvme_discover 00:03:39.948 LINK nvmf_tgt 00:03:39.948 LINK histogram_perf 00:03:39.948 LINK jsoncat 00:03:39.948 LINK reactor 00:03:39.948 LINK spdk_trace_record 00:03:39.948 LINK vtophys 00:03:39.948 LINK event_perf 00:03:39.948 LINK startup 00:03:39.948 LINK interrupt_tgt 00:03:39.948 LINK reactor_perf 00:03:39.948 LINK poller_perf 00:03:39.948 LINK lsvmd 00:03:39.948 LINK stub 00:03:39.948 LINK err_injection 00:03:39.948 LINK zipf 00:03:39.948 LINK env_dpdk_post_init 00:03:39.948 LINK hello_world 00:03:39.948 LINK app_repeat 00:03:39.948 LINK ioat_perf 00:03:39.948 LINK iscsi_tgt 00:03:39.948 LINK connect_stress 00:03:39.948 LINK bdev_svc 00:03:39.948 LINK fused_ordering 00:03:39.948 LINK doorbell_aers 00:03:39.948 LINK reserve 00:03:39.948 LINK vhost 00:03:39.948 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:39.948 struct spdk_nvme_fdp_ruhs ruhs; 00:03:39.948 ^ 00:03:39.948 LINK boot_partition 00:03:39.948 LINK cmb_copy 00:03:40.214 LINK hotplug 00:03:40.214 LINK simple_copy 00:03:40.214 LINK spdk_tgt 00:03:40.214 CXX test/cpp_headers/accel_module.o 00:03:40.214 LINK nvme_dp 00:03:40.214 LINK reset 00:03:40.214 LINK scheduler 00:03:40.214 LINK hello_blob 00:03:40.214 LINK mkfs 00:03:40.214 LINK sgl 00:03:40.214 LINK aer 00:03:40.214 LINK hello_sock 00:03:40.214 LINK overhead 00:03:40.214 LINK idxd_perf 00:03:40.214 LINK fdp 00:03:40.214 LINK spdk_trace 00:03:40.214 LINK thread 00:03:40.214 LINK hello_bdev 00:03:40.214 LINK reconnect 00:03:40.214 LINK spdk_dd 00:03:40.214 LINK nvmf 00:03:40.214 LINK arbitration 00:03:40.478 LINK test_dma 00:03:40.478 LINK nvme_manage 00:03:40.478 
LINK accel_perf 00:03:40.478 LINK pci_ut 00:03:40.478 LINK bdevio 00:03:40.478 CXX test/cpp_headers/assert.o 00:03:40.478 LINK dif 00:03:40.478 LINK nvme_fuzz 00:03:40.478 LINK nvme_compliance 00:03:40.478 LINK blobcli 00:03:40.478 1 warning generated. 00:03:40.478 CXX test/cpp_headers/barrier.o 00:03:40.738 LINK spdk_nvme 00:03:40.738 LINK mem_callbacks 00:03:40.738 LINK spdk_nvme_identify 00:03:40.738 CC examples/ioat/verify/verify.o 00:03:40.738 CXX test/cpp_headers/base64.o 00:03:40.738 CC examples/vmd/led/led.o 00:03:40.738 CXX test/cpp_headers/bdev.o 00:03:40.738 LINK spdk_nvme_perf 00:03:40.999 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:40.999 CC app/fio/bdev/fio_plugin.o 00:03:40.999 CC examples/nvme/abort/abort.o 00:03:40.999 LINK cuse 00:03:40.999 LINK bdevperf 00:03:40.999 CXX test/cpp_headers/bdev_module.o 00:03:40.999 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:40.999 LINK spdk_top 00:03:40.999 CXX test/cpp_headers/bdev_zone.o 00:03:40.999 CXX test/cpp_headers/bit_array.o 00:03:40.999 LINK memory_ut 00:03:40.999 CXX test/cpp_headers/bit_pool.o 00:03:40.999 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:40.999 LINK led 00:03:40.999 LINK verify 00:03:41.260 CXX test/cpp_headers/blob_bdev.o 00:03:41.260 CXX test/cpp_headers/blobfs_bdev.o 00:03:41.260 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:41.261 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:41.261 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:41.261 CXX test/cpp_headers/blobfs.o 00:03:41.261 CXX test/cpp_headers/blob.o 00:03:41.261 CXX test/cpp_headers/conf.o 00:03:41.261 CXX test/cpp_headers/config.o 00:03:41.261 CXX test/cpp_headers/cpuset.o 00:03:41.261 LINK pmr_persistence 00:03:41.261 CXX test/cpp_headers/crc16.o 00:03:41.261 CXX test/cpp_headers/crc32.o 00:03:41.261 CXX test/cpp_headers/crc64.o 00:03:41.261 CXX test/cpp_headers/dif.o 00:03:41.261 CXX test/cpp_headers/dma.o 00:03:41.261 CXX test/cpp_headers/endian.o 00:03:41.261 CXX 
test/cpp_headers/env_dpdk.o 00:03:41.521 CXX test/cpp_headers/env.o 00:03:41.521 CXX test/cpp_headers/event.o 00:03:41.521 CXX test/cpp_headers/fd_group.o 00:03:41.521 CXX test/cpp_headers/fd.o 00:03:41.521 CXX test/cpp_headers/file.o 00:03:41.521 CXX test/cpp_headers/ftl.o 00:03:41.521 LINK abort 00:03:41.521 CXX test/cpp_headers/gpt_spec.o 00:03:41.521 CXX test/cpp_headers/hexlify.o 00:03:41.521 CXX test/cpp_headers/histogram_data.o 00:03:41.521 CXX test/cpp_headers/idxd.o 00:03:41.521 CXX test/cpp_headers/idxd_spec.o 00:03:41.521 CXX test/cpp_headers/init.o 00:03:41.521 CXX test/cpp_headers/ioat.o 00:03:41.521 CXX test/cpp_headers/ioat_spec.o 00:03:41.521 CXX test/cpp_headers/iscsi_spec.o 00:03:41.521 CXX test/cpp_headers/json.o 00:03:41.521 CXX test/cpp_headers/jsonrpc.o 00:03:41.521 CXX test/cpp_headers/likely.o 00:03:41.521 CXX test/cpp_headers/log.o 00:03:41.521 CXX test/cpp_headers/lvol.o 00:03:41.521 CXX test/cpp_headers/memory.o 00:03:41.521 CXX test/cpp_headers/mmio.o 00:03:41.521 CXX test/cpp_headers/nbd.o 00:03:41.521 CXX test/cpp_headers/notify.o 00:03:41.521 CXX test/cpp_headers/nvme.o 00:03:41.521 CXX test/cpp_headers/nvme_intel.o 00:03:41.521 LINK spdk_bdev 00:03:41.521 CXX test/cpp_headers/nvme_ocssd.o 00:03:41.780 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:41.780 CXX test/cpp_headers/nvme_spec.o 00:03:41.780 CXX test/cpp_headers/nvme_zns.o 00:03:41.780 CXX test/cpp_headers/nvmf_cmd.o 00:03:41.780 LINK llvm_vfio_fuzz 00:03:41.780 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:41.780 CXX test/cpp_headers/nvmf.o 00:03:41.780 CXX test/cpp_headers/nvmf_spec.o 00:03:41.780 CXX test/cpp_headers/nvmf_transport.o 00:03:41.780 CXX test/cpp_headers/opal.o 00:03:41.780 CXX test/cpp_headers/opal_spec.o 00:03:41.780 CXX test/cpp_headers/pci_ids.o 00:03:41.780 CXX test/cpp_headers/pipe.o 00:03:41.780 CXX test/cpp_headers/queue.o 00:03:41.780 CXX test/cpp_headers/reduce.o 00:03:41.780 CXX test/cpp_headers/rpc.o 00:03:41.780 CXX test/cpp_headers/scheduler.o 
00:03:41.780 CXX test/cpp_headers/scsi.o 00:03:41.780 LINK vhost_fuzz 00:03:41.780 CXX test/cpp_headers/scsi_spec.o 00:03:41.780 CXX test/cpp_headers/sock.o 00:03:41.780 CXX test/cpp_headers/stdinc.o 00:03:41.780 CXX test/cpp_headers/string.o 00:03:41.780 CXX test/cpp_headers/thread.o 00:03:41.780 CXX test/cpp_headers/trace.o 00:03:41.780 CXX test/cpp_headers/trace_parser.o 00:03:41.780 CXX test/cpp_headers/tree.o 00:03:41.780 CXX test/cpp_headers/ublk.o 00:03:41.780 CXX test/cpp_headers/util.o 00:03:41.780 CXX test/cpp_headers/uuid.o 00:03:41.780 CXX test/cpp_headers/version.o 00:03:41.780 CXX test/cpp_headers/vfio_user_pci.o 00:03:41.780 CXX test/cpp_headers/vfio_user_spec.o 00:03:41.780 CXX test/cpp_headers/vhost.o 00:03:41.780 CXX test/cpp_headers/vmd.o 00:03:41.780 CXX test/cpp_headers/xor.o 00:03:41.780 CXX test/cpp_headers/zipf.o 00:03:42.038 LINK llvm_nvme_fuzz 00:03:42.038 LINK spdk_lock 00:03:42.605 LINK iscsi_fuzz 00:03:45.138 LINK esnap 00:03:45.397 00:03:45.397 real 0m30.533s 00:03:45.397 user 6m1.635s 00:03:45.397 sys 1m58.469s 00:03:45.397 13:16:04 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:45.397 13:16:04 -- common/autotest_common.sh@10 -- $ set +x 00:03:45.397 ************************************ 00:03:45.397 END TEST make 00:03:45.397 ************************************ 00:03:45.656 13:16:04 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:45.656 13:16:04 -- nvmf/common.sh@7 -- # uname -s 00:03:45.656 13:16:04 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:45.656 13:16:04 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:45.656 13:16:04 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:45.656 13:16:04 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:45.656 13:16:04 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:45.656 13:16:04 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:45.656 13:16:04 -- nvmf/common.sh@14 -- # 
NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:45.656 13:16:04 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:45.656 13:16:04 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:45.656 13:16:04 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:45.656 13:16:04 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:03:45.656 13:16:04 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:03:45.656 13:16:04 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:45.656 13:16:04 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:45.656 13:16:04 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:45.656 13:16:04 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:45.656 13:16:04 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:45.656 13:16:04 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:45.656 13:16:04 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:45.656 13:16:04 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:45.656 13:16:04 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:45.656 13:16:04 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:45.656 13:16:04 -- paths/export.sh@5 -- # export PATH 00:03:45.656 13:16:04 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:45.656 13:16:04 -- nvmf/common.sh@46 -- # : 0 00:03:45.656 13:16:04 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:45.656 13:16:04 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:45.656 13:16:04 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:45.656 13:16:04 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:45.656 13:16:04 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:45.656 13:16:04 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:45.656 13:16:04 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:45.656 13:16:04 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:45.656 13:16:04 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:45.656 13:16:04 -- spdk/autotest.sh@32 -- # uname -s 00:03:45.656 13:16:04 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:45.656 13:16:04 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:45.656 13:16:04 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:45.656 13:16:04 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:45.656 13:16:04 -- spdk/autotest.sh@40 -- # echo 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:45.656 13:16:04 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:45.656 13:16:04 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:45.656 13:16:04 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:45.656 13:16:04 -- spdk/autotest.sh@48 -- # udevadm_pid=3094359 00:03:45.656 13:16:04 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:45.656 13:16:04 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:45.656 13:16:04 -- spdk/autotest.sh@54 -- # echo 3094361 00:03:45.656 13:16:04 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:45.656 13:16:04 -- spdk/autotest.sh@56 -- # echo 3094362 00:03:45.656 13:16:04 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:45.656 13:16:04 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:45.656 13:16:04 -- spdk/autotest.sh@60 -- # echo 3094363 00:03:45.656 13:16:04 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:45.656 13:16:04 -- spdk/autotest.sh@62 -- # echo 3094364 00:03:45.656 13:16:04 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:45.656 13:16:04 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:45.656 13:16:04 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:45.656 13:16:04 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:45.656 13:16:04 -- common/autotest_common.sh@10 -- # set +x 00:03:45.656 13:16:04 -- spdk/autotest.sh@70 -- # create_test_list 00:03:45.656 13:16:04 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:45.656 13:16:04 -- common/autotest_common.sh@10 -- # set +x 00:03:45.656 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:45.656 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:45.656 13:16:04 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:45.656 13:16:04 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:45.656 13:16:04 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:45.656 13:16:04 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:45.656 13:16:04 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:45.656 13:16:04 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:45.656 13:16:04 -- 
common/autotest_common.sh@1440 -- # uname 00:03:45.656 13:16:04 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:45.656 13:16:04 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:45.656 13:16:04 -- common/autotest_common.sh@1460 -- # uname 00:03:45.656 13:16:04 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:45.656 13:16:04 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:45.656 13:16:04 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:45.656 13:16:04 -- spdk/autotest.sh@83 -- # hash lcov 00:03:45.656 13:16:04 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:45.656 13:16:04 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:45.656 13:16:04 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:45.656 13:16:04 -- common/autotest_common.sh@10 -- # set +x 00:03:45.656 13:16:04 -- spdk/autotest.sh@102 -- # rm -f 00:03:45.656 13:16:04 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:49.848 0000:1a:00.0 (8086 0a54): Already using the nvme driver 00:03:49.848 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:49.848 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:49.848 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:49.848 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:49.848 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:49.848 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.3 
(8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:50.107 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:50.365 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:52.297 13:16:10 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:52.297 13:16:10 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:52.297 13:16:10 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:52.297 13:16:10 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:52.297 13:16:10 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:52.297 13:16:10 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:52.297 13:16:10 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:52.297 13:16:10 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:52.297 13:16:10 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:52.297 13:16:10 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:52.297 13:16:10 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:52.297 13:16:10 -- spdk/autotest.sh@121 -- # grep -v p 00:03:52.297 13:16:10 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:52.297 13:16:10 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:52.297 13:16:10 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:52.297 13:16:10 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:52.297 13:16:10 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:52.297 No valid GPT data, bailing 00:03:52.297 13:16:11 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:52.297 13:16:11 -- scripts/common.sh@393 -- # pt= 00:03:52.297 13:16:11 -- scripts/common.sh@394 -- # return 1 00:03:52.297 13:16:11 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 
00:03:52.297 1+0 records in 00:03:52.297 1+0 records out 00:03:52.297 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00704815 s, 149 MB/s 00:03:52.297 13:16:11 -- spdk/autotest.sh@129 -- # sync 00:03:52.297 13:16:11 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:52.297 13:16:11 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:52.297 13:16:11 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:57.572 13:16:15 -- spdk/autotest.sh@135 -- # uname -s 00:03:57.572 13:16:15 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:57.572 13:16:15 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:57.572 13:16:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:57.572 13:16:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:57.572 13:16:15 -- common/autotest_common.sh@10 -- # set +x 00:03:57.572 ************************************ 00:03:57.572 START TEST setup.sh 00:03:57.572 ************************************ 00:03:57.572 13:16:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:57.572 * Looking for test storage... 
00:03:57.572 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:57.572 13:16:15 -- setup/test-setup.sh@10 -- # uname -s 00:03:57.572 13:16:15 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:57.572 13:16:15 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:57.572 13:16:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:57.572 13:16:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:57.572 13:16:15 -- common/autotest_common.sh@10 -- # set +x 00:03:57.572 ************************************ 00:03:57.572 START TEST acl 00:03:57.572 ************************************ 00:03:57.572 13:16:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:57.572 * Looking for test storage... 00:03:57.572 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:57.572 13:16:15 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:57.572 13:16:15 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:57.572 13:16:15 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:57.572 13:16:15 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:57.572 13:16:15 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:57.572 13:16:15 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:57.572 13:16:15 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:57.572 13:16:15 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:57.572 13:16:15 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:57.572 13:16:15 -- setup/acl.sh@12 -- # devs=() 00:03:57.572 13:16:15 -- setup/acl.sh@12 -- # declare -a devs 00:03:57.572 13:16:15 -- setup/acl.sh@13 -- # drivers=() 00:03:57.572 13:16:15 -- setup/acl.sh@13 -- # declare -A drivers 00:03:57.572 13:16:15 -- 
setup/acl.sh@51 -- # setup reset 00:03:57.572 13:16:15 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:57.572 13:16:15 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:04.139 13:16:22 -- setup/acl.sh@52 -- # collect_setup_devs 00:04:04.139 13:16:22 -- setup/acl.sh@16 -- # local dev driver 00:04:04.139 13:16:22 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:04.139 13:16:22 -- setup/acl.sh@15 -- # setup output status 00:04:04.139 13:16:22 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.139 13:16:22 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:07.425 Hugepages 00:04:07.425 node hugesize free / total 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 00:04:07.425 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 
00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.425 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.425 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.425 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:1a:00.0 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:07.684 
13:16:26 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:07.684 13:16:26 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 
00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:07.684 13:16:26 -- setup/acl.sh@20 -- # continue 00:04:07.684 13:16:26 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:07.684 13:16:26 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:07.684 13:16:26 -- setup/acl.sh@54 -- # run_test denied denied 00:04:07.684 13:16:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:07.684 13:16:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:07.684 13:16:26 -- common/autotest_common.sh@10 -- # set +x 00:04:07.684 ************************************ 00:04:07.684 START TEST denied 00:04:07.684 ************************************ 00:04:07.684 13:16:26 -- common/autotest_common.sh@1104 -- # denied 00:04:07.684 13:16:26 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:1a:00.0' 00:04:07.684 13:16:26 -- setup/acl.sh@38 -- # setup output config 00:04:07.684 13:16:26 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:1a:00.0' 00:04:07.684 13:16:26 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:07.684 13:16:26 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:14.247 0000:1a:00.0 (8086 0a54): Skipping denied controller at 0000:1a:00.0 00:04:14.247 13:16:32 -- setup/acl.sh@40 -- # verify 0000:1a:00.0 00:04:14.247 13:16:32 -- setup/acl.sh@28 -- # local dev driver 00:04:14.247 13:16:32 -- setup/acl.sh@30 -- # for dev in "$@" 00:04:14.247 13:16:32 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:1a:00.0 ]] 00:04:14.247 13:16:32 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:1a:00.0/driver 00:04:14.247 13:16:32 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 
00:04:14.247 13:16:32 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:14.247 13:16:32 -- setup/acl.sh@41 -- # setup reset 00:04:14.247 13:16:32 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:14.247 13:16:32 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:22.368 00:04:22.368 real 0m13.446s 00:04:22.368 user 0m4.122s 00:04:22.368 sys 0m8.545s 00:04:22.368 13:16:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:22.368 13:16:39 -- common/autotest_common.sh@10 -- # set +x 00:04:22.368 ************************************ 00:04:22.368 END TEST denied 00:04:22.368 ************************************ 00:04:22.368 13:16:39 -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:22.368 13:16:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:22.368 13:16:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:22.368 13:16:39 -- common/autotest_common.sh@10 -- # set +x 00:04:22.368 ************************************ 00:04:22.368 START TEST allowed 00:04:22.368 ************************************ 00:04:22.368 13:16:39 -- common/autotest_common.sh@1104 -- # allowed 00:04:22.368 13:16:39 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:1a:00.0 00:04:22.368 13:16:39 -- setup/acl.sh@45 -- # setup output config 00:04:22.368 13:16:39 -- setup/acl.sh@46 -- # grep -E '0000:1a:00.0 .*: nvme -> .*' 00:04:22.368 13:16:39 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.368 13:16:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:30.525 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:04:30.525 13:16:49 -- setup/acl.sh@47 -- # verify 00:04:30.525 13:16:49 -- setup/acl.sh@28 -- # local dev driver 00:04:30.525 13:16:49 -- setup/acl.sh@48 -- # setup reset 00:04:30.525 13:16:49 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:30.525 13:16:49 -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:37.101 00:04:37.101 real 0m15.911s 00:04:37.101 user 0m4.233s 00:04:37.101 sys 0m8.538s 00:04:37.101 13:16:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:37.101 13:16:55 -- common/autotest_common.sh@10 -- # set +x 00:04:37.101 ************************************ 00:04:37.101 END TEST allowed 00:04:37.101 ************************************ 00:04:37.101 00:04:37.101 real 0m40.099s 00:04:37.101 user 0m12.096s 00:04:37.101 sys 0m24.379s 00:04:37.101 13:16:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:37.101 13:16:55 -- common/autotest_common.sh@10 -- # set +x 00:04:37.101 ************************************ 00:04:37.101 END TEST acl 00:04:37.101 ************************************ 00:04:37.101 13:16:55 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:37.101 13:16:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:37.101 13:16:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:37.101 13:16:55 -- common/autotest_common.sh@10 -- # set +x 00:04:37.101 ************************************ 00:04:37.101 START TEST hugepages 00:04:37.101 ************************************ 00:04:37.101 13:16:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:37.361 * Looking for test storage... 
00:04:37.361 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:37.361 13:16:56 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:37.361 13:16:56 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:37.361 13:16:56 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:37.361 13:16:56 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:37.361 13:16:56 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:37.361 13:16:56 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:37.361 13:16:56 -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:37.361 13:16:56 -- setup/common.sh@18 -- # local node= 00:04:37.361 13:16:56 -- setup/common.sh@19 -- # local var val 00:04:37.361 13:16:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:37.361 13:16:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.361 13:16:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.361 13:16:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.361 13:16:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.361 13:16:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 66379600 kB' 'MemAvailable: 70359840 kB' 'Buffers: 9896 kB' 'Cached: 18226508 kB' 'SwapCached: 0 kB' 'Active: 15021576 kB' 'Inactive: 3731776 kB' 'Active(anon): 14463448 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520260 kB' 'Mapped: 203280 kB' 'Shmem: 13946500 kB' 'KReclaimable: 526520 kB' 'Slab: 950756 kB' 'SReclaimable: 526520 kB' 'SUnreclaim: 424236 kB' 'KernelStack: 16288 kB' 
'PageTables: 8940 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438216 kB' 'Committed_AS: 15841872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214168 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 
13:16:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:37.361 13:16:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:37.361 13:16:56 -- setup/common.sh@32 -- # continue 
00:04:37.361 13:16:56 -- setup/common.sh@31 -- # IFS=': '
00:04:37.362 13:16:56 -- setup/common.sh@31 -- # read -r var val _
00:04:37.362 13:16:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:37.362 13:16:56 -- setup/common.sh@32 -- # continue
[... same IFS/read/compare/continue cycle repeated for each remaining /proc/meminfo field (PageTables, SecPageTables, ..., HugePages_Surp) ...]
00:04:37.362 13:16:56 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:37.362 13:16:56 -- setup/common.sh@33 -- # echo 2048
00:04:37.362 13:16:56 -- setup/common.sh@33 -- # return 0
00:04:37.362 13:16:56 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:04:37.362 13:16:56 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:04:37.362 13:16:56 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:04:37.362 13:16:56 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:04:37.362 13:16:56 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:04:37.362 13:16:56 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:04:37.362 13:16:56 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:04:37.362 13:16:56 -- setup/hugepages.sh@207 -- # get_nodes
00:04:37.362 13:16:56 -- setup/hugepages.sh@27 -- # local node
00:04:37.362 13:16:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:37.362 13:16:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:04:37.362 13:16:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:37.362 13:16:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:37.362 13:16:56 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:37.362 13:16:56 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:37.362 13:16:56 -- setup/hugepages.sh@208 -- # clear_hp
00:04:37.362 13:16:56 -- setup/hugepages.sh@37 -- # local node hp
[... clear_hp loop: for node in "${!nodes_sys[@]}"; for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*; echo 0 -- four echo-0 iterations across the two nodes ...]
00:04:37.362 13:16:56 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:37.362 13:16:56 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:37.362 13:16:56 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:04:37.362 13:16:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:37.362 13:16:56 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:37.362 13:16:56 -- common/autotest_common.sh@10 -- # set +x
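The hugepage bookkeeping traced above (a `default_hugepages` size scanned out of `/proc/meminfo`, per-node `nodes_sys` counts, and a `clear_hp` pass that zeroes each node's `nr_hugepages`) can be sketched roughly as below. This is an illustrative reconstruction, not the actual `setup/hugepages.sh`: `clear_hp` here only echoes the sysfs writes it would perform, and the size unit (kB) is inferred from the trace, where `get_test_nr_hugepages 2097152` yields `nr_hugepages=1024` with 2048 kB pages.

```shell
# Illustrative reconstruction of the hugepage bookkeeping in the trace above,
# NOT the real setup/hugepages.sh. clear_hp normally writes 0 into each
# node's nr_hugepages file; echoing keeps the sketch safe to run anywhere.
default_hugepages=2048   # kB, as scanned out of /proc/meminfo in the trace

# size (kB) / hugepage size (kB) -> page count; 2097152 / 2048 = 1024,
# matching nr_hugepages=1024 in the trace.
get_test_nr_hugepages() {
    local size=$1
    echo $(( size / default_hugepages ))
}

clear_hp() {
    local node hp
    for node in "$@"; do
        for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*; do
            echo "would write 0 > $hp/nr_hugepages"
        done
    done
}

get_test_nr_hugepages 2097152   # prints 1024
```

The real script additionally distributes the page count across the requested NUMA nodes (`get_test_nr_hugepages_per_node`), as the `nodes_test[_no_nodes]=1024` line in the trace shows.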
************************************
00:04:37.362 START TEST default_setup
00:04:37.362 ************************************
00:04:37.362 13:16:56 -- common/autotest_common.sh@1104 -- # default_setup
00:04:37.362 13:16:56 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:04:37.362 13:16:56 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:37.362 13:16:56 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:37.362 13:16:56 -- setup/hugepages.sh@51 -- # shift
00:04:37.362 13:16:56 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:37.362 13:16:56 -- setup/hugepages.sh@52 -- # local node_ids
00:04:37.362 13:16:56 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:37.362 13:16:56 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:37.362 13:16:56 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:37.362 13:16:56 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:37.362 13:16:56 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:37.362 13:16:56 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:37.363 13:16:56 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:37.363 13:16:56 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:37.363 13:16:56 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:37.363 13:16:56 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:37.363 13:16:56 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:37.363 13:16:56 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:37.363 13:16:56 -- setup/hugepages.sh@73 -- # return 0
00:04:37.363 13:16:56 -- setup/hugepages.sh@137 -- # setup output
00:04:37.363 13:16:56 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:37.363 13:16:56 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:41.551 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:41.551 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:44.839 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci
00:04:46.220 13:17:04 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:04:46.220 13:17:04 -- setup/hugepages.sh@89 -- # local node
00:04:46.220 13:17:04 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:46.220 13:17:04 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:46.220 13:17:04 -- setup/hugepages.sh@92 -- # local surp
00:04:46.220 13:17:04 -- setup/hugepages.sh@93 -- # local resv
00:04:46.220 13:17:04 -- setup/hugepages.sh@94 -- # local anon
00:04:46.220 13:17:04 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:46.220 13:17:04 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:46.220 13:17:04 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:46.220 13:17:04 -- setup/common.sh@18 -- # local node=
00:04:46.220 13:17:04 -- setup/common.sh@19 -- # local var val
00:04:46.220 13:17:04 -- setup/common.sh@20 -- # local mem_f mem
00:04:46.220 13:17:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:46.220 13:17:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:46.220 13:17:04 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:46.220 13:17:04 -- setup/common.sh@28 -- #
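Lines like `0000:00:04.7 (8086 2021): ioatdma -> vfio-pci` above are `scripts/setup.sh` reporting PCI devices being rebound from their kernel drivers to `vfio-pci` for userspace I/O. The usual sysfs sequence behind such a rebind looks roughly like the sketch below; this is NOT SPDK's actual implementation, and the function name plus the `DRY_RUN` guard are illustrative additions so the sketch can run unprivileged. The real script also handles `driver_override` cleanup, IOMMU groups, and device permissions.

```shell
# Hedged sketch of a sysfs driver rebind ("ioatdma -> vfio-pci"), not the
# real spdk/scripts/setup.sh. With DRY_RUN=1 the commands are echoed
# instead of executed, since the writes require root and real hardware.
rebind_to_vfio() {
    local bdf=$1
    local dev=/sys/bus/pci/devices/$bdf
    run() {
        if [[ ${DRY_RUN:-0} == 1 ]]; then echo "$*"; else eval "$*"; fi
    }
    run "echo $bdf > $dev/driver/unbind"          # detach the current driver
    run "echo vfio-pci > $dev/driver_override"    # pin the next probe to vfio-pci
    run "echo $bdf > /sys/bus/pci/drivers_probe"  # ask the kernel to rebind
}

DRY_RUN=1 rebind_to_vfio 0000:00:04.7
```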
mapfile -t mem
00:04:46.220 13:17:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:46.220 13:17:04 -- setup/common.sh@31 -- # IFS=': '
00:04:46.220 13:17:04 -- setup/common.sh@31 -- # read -r var val _
00:04:46.221 13:17:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68530704 kB' 'MemAvailable: 72510784 kB' 'Buffers: 9896 kB' 'Cached: 18226676 kB' 'SwapCached: 0 kB' 'Active: 15037180 kB' 'Inactive: 3731776 kB' 'Active(anon): 14479052 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535724 kB' 'Mapped: 203300 kB' 'Shmem: 13946668 kB' 'KReclaimable: 526360 kB' 'Slab: 948996 kB' 'SReclaimable: 526360 kB' 'SUnreclaim: 422636 kB' 'KernelStack: 16384 kB' 'PageTables: 9252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15859272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214232 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB'
[... IFS/read/compare/continue cycle repeated for each /proc/meminfo field (MemTotal, MemFree, ..., HardwareCorrupted) until AnonHugePages matches ...]
00:04:46.222 13:17:04 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:46.222 13:17:04 -- setup/common.sh@33 -- # echo 0
00:04:46.222 13:17:04 -- setup/common.sh@33 -- # return 0
00:04:46.222 13:17:04 -- setup/hugepages.sh@97 -- # anon=0
00:04:46.222 13:17:04 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:46.222 13:17:04 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:46.222 13:17:04 -- setup/common.sh@18 -- # local node=
00:04:46.222 13:17:04 -- setup/common.sh@19 -- # local var val
00:04:46.222 13:17:04 -- setup/common.sh@20 -- # local mem_f mem
00:04:46.222 13:17:04 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:46.222 13:17:04 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:46.222 13:17:04 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:46.222 13:17:04 -- setup/common.sh@28 -- # mapfile -t mem
00:04:46.222 13:17:04 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:46.222 13:17:04 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68528696 kB' 'MemAvailable: 72508736 kB' 'Buffers: 9896 kB' 'Cached: 18226676 kB' 'SwapCached: 0 kB' 'Active: 15037324 kB' 'Inactive: 3731776 kB' 'Active(anon): 14479196 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535884 kB' 'Mapped: 203300 kB' 'Shmem: 13946668 kB' 'KReclaimable: 526320 kB' 'Slab: 948944 kB' 'SReclaimable: 526320 kB' 'SUnreclaim: 422624 kB' 'KernelStack: 16368 kB' 'PageTables: 9208 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15859284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214200 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB'
00:04:46.222 13:17:04 -- setup/common.sh@31 -- # IFS=': '
00:04:46.222 13:17:04 -- setup/common.sh@31 -- # read -r var val _
[... IFS/read/compare/continue cycle repeated for each /proc/meminfo field (MemTotal, MemFree, ..., CmaTotal) as the scan for HugePages_Surp continues ...]
setup/common.sh@32 -- # continue 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.223 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.223 13:17:05 -- setup/common.sh@33 -- # echo 0 00:04:46.223 13:17:05 -- setup/common.sh@33 -- # return 0 00:04:46.223 13:17:05 -- setup/hugepages.sh@99 -- # surp=0 00:04:46.223 13:17:05 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:46.223 13:17:05 -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:04:46.223 13:17:05 -- setup/common.sh@18 -- # local node= 00:04:46.223 13:17:05 -- setup/common.sh@19 -- # local var val 00:04:46.223 13:17:05 -- setup/common.sh@20 -- # local mem_f mem 00:04:46.223 13:17:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.223 13:17:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.223 13:17:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.223 13:17:05 -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.223 13:17:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.223 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68529008 kB' 'MemAvailable: 72509048 kB' 'Buffers: 9896 kB' 'Cached: 18226688 kB' 'SwapCached: 0 kB' 'Active: 15037104 kB' 'Inactive: 3731776 kB' 'Active(anon): 14478976 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535680 kB' 'Mapped: 203252 kB' 'Shmem: 13946680 kB' 'KReclaimable: 526320 kB' 'Slab: 948936 kB' 'SReclaimable: 526320 kB' 'SUnreclaim: 422616 kB' 'KernelStack: 16336 kB' 'PageTables: 8764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15859296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214200 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 
30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- 
setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.224 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.224 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- 
setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- 
setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- 
# continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.225 13:17:05 -- setup/common.sh@33 -- # echo 0 00:04:46.225 13:17:05 -- setup/common.sh@33 -- # return 0 00:04:46.225 13:17:05 -- setup/hugepages.sh@100 -- # resv=0 00:04:46.225 13:17:05 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:46.225 nr_hugepages=1024 00:04:46.225 13:17:05 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:46.225 resv_hugepages=0 00:04:46.225 13:17:05 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:46.225 surplus_hugepages=0 00:04:46.225 13:17:05 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:46.225 anon_hugepages=0 00:04:46.225 13:17:05 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:46.225 13:17:05 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:46.225 13:17:05 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:46.225 13:17:05 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:46.225 13:17:05 -- setup/common.sh@18 -- # local node= 00:04:46.225 13:17:05 -- setup/common.sh@19 -- # local var val 00:04:46.225 13:17:05 -- setup/common.sh@20 -- # local mem_f mem 00:04:46.225 13:17:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.225 13:17:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.225 13:17:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.225 13:17:05 -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.225 13:17:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68528984 kB' 'MemAvailable: 72509024 kB' 
'Buffers: 9896 kB' 'Cached: 18226704 kB' 'SwapCached: 0 kB' 'Active: 15037424 kB' 'Inactive: 3731776 kB' 'Active(anon): 14479296 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535972 kB' 'Mapped: 203252 kB' 'Shmem: 13946696 kB' 'KReclaimable: 526320 kB' 'Slab: 948936 kB' 'SReclaimable: 526320 kB' 'SUnreclaim: 422616 kB' 'KernelStack: 16352 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15859312 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214200 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 
-- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.225 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.225 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.226 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.226 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.226 13:17:05 -- setup/common.sh@32 -- # [[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue — loop over the remaining /proc/meminfo keys (Zswapped through Unaccepted), none matching; repeated iterations elided 00:04:46.487 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.487 13:17:05 -- setup/common.sh@33 -- # echo 1024 00:04:46.487 13:17:05 -- setup/common.sh@33 -- # return 0 00:04:46.487 13:17:05 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:46.487 13:17:05 -- setup/hugepages.sh@112 -- # get_nodes 00:04:46.487 13:17:05 -- setup/hugepages.sh@27 -- # local node 00:04:46.487 13:17:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.487 13:17:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:46.487 13:17:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:46.487 13:17:05 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:46.487 13:17:05 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:46.487 13:17:05 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:46.487 13:17:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:46.487 13:17:05 -- setup/hugepages.sh@117 -- # get_meminfo
HugePages_Surp 0 00:04:46.487 13:17:05 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.487 13:17:05 -- setup/common.sh@18 -- # local node=0 00:04:46.487 13:17:05 -- setup/common.sh@19 -- # local var val 00:04:46.487 13:17:05 -- setup/common.sh@20 -- # local mem_f mem 00:04:46.487 13:17:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.487 13:17:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:46.487 13:17:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:46.487 13:17:05 -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.487 13:17:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.487 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.487 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.487 13:17:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116968 kB' 'MemFree: 39255236 kB' 'MemUsed: 8861732 kB' 'SwapCached: 0 kB' 'Active: 4550956 kB' 'Inactive: 285836 kB' 'Active(anon): 4133304 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528328 kB' 'Mapped: 113652 kB' 'AnonPages: 311708 kB' 'Shmem: 3824840 kB' 'KernelStack: 8680 kB' 'PageTables: 5628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361324 kB' 'Slab: 600240 kB' 'SReclaimable: 361324 kB' 'SUnreclaim: 238916 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:46.487 13:17:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.487 13:17:05 -- setup/common.sh@32 -- # continue 00:04:46.487 13:17:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:46.487 13:17:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:46.487 13:17:05 -- 
setup/common.sh@32 -- # [[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue — loop over the node0 meminfo keys (MemFree through HugePages_Free), none matching; repeated iterations elided 00:04:46.488 13:17:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.488 13:17:05 --
setup/common.sh@33 -- # echo 0 00:04:46.488 13:17:05 -- setup/common.sh@33 -- # return 0 00:04:46.488 13:17:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:46.488 13:17:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:46.488 13:17:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:46.488 13:17:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:46.488 13:17:05 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:46.488 node0=1024 expecting 1024 00:04:46.488 13:17:05 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:46.489 00:04:46.489 real 0m8.973s 00:04:46.489 user 0m1.932s 00:04:46.489 sys 0m3.997s 00:04:46.489 13:17:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:46.489 13:17:05 -- common/autotest_common.sh@10 -- # set +x 00:04:46.489 ************************************ 00:04:46.489 END TEST default_setup 00:04:46.489 ************************************ 00:04:46.489 13:17:05 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:46.489 13:17:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:46.489 13:17:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:46.489 13:17:05 -- common/autotest_common.sh@10 -- # set +x 00:04:46.489 ************************************ 00:04:46.489 START TEST per_node_1G_alloc 00:04:46.489 ************************************ 00:04:46.489 13:17:05 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc 00:04:46.489 13:17:05 -- setup/hugepages.sh@143 -- # local IFS=, 00:04:46.489 13:17:05 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:46.489 13:17:05 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:46.489 13:17:05 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:46.489 13:17:05 -- setup/hugepages.sh@51 -- # shift 00:04:46.489 13:17:05 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:46.489 13:17:05 -- setup/hugepages.sh@52 -- # 
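The trace above runs the same `IFS=': ' read` loop from setup/common.sh over and over to pull a single key (HugePages_Total, HugePages_Surp) out of a meminfo file. A minimal standalone sketch of that pattern — the function name is kept from the trace, but restricting it to /proc/meminfo only is a simplification of mine:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo scan seen in setup/common.sh: split each
# meminfo line on ':' and spaces, compare the key, echo the value.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # value in kB, or a bare count for HugePages_*
            return 0
        fi
    done < /proc/meminfo
    return 1   # key not found
}

get_meminfo HugePages_Total   # prints the system-wide hugepage count
```

The real helper additionally switches `mem_f` to `/sys/devices/system/node/node<N>/meminfo` when a node argument is given, which is how the per-node HugePages_Surp lookups above are served.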
local node_ids 00:04:46.489 13:17:05 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:46.489 13:17:05 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:46.489 13:17:05 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:46.489 13:17:05 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:46.489 13:17:05 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:46.489 13:17:05 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:46.489 13:17:05 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:46.489 13:17:05 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:46.489 13:17:05 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:46.489 13:17:05 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:46.489 13:17:05 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:46.489 13:17:05 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:46.489 13:17:05 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:46.489 13:17:05 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:46.489 13:17:05 -- setup/hugepages.sh@73 -- # return 0 00:04:46.489 13:17:05 -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:46.489 13:17:05 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:46.489 13:17:05 -- setup/hugepages.sh@146 -- # setup output 00:04:46.489 13:17:05 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.489 13:17:05 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:50.680 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:50.680 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:00:04.2 (8086 2021): 
Already using the vfio-pci driver 00:04:50.680 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:50.680 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:52.060 13:17:10 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:52.060 13:17:10 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:52.060 13:17:10 -- setup/hugepages.sh@89 -- # local node 00:04:52.060 13:17:10 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:52.060 13:17:10 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:52.060 13:17:10 -- setup/hugepages.sh@92 -- # local surp 00:04:52.060 13:17:10 -- setup/hugepages.sh@93 -- # local resv 00:04:52.060 13:17:10 -- setup/hugepages.sh@94 -- # local anon 00:04:52.060 13:17:10 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:52.060 13:17:10 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:52.060 13:17:10 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:52.060 13:17:10 -- setup/common.sh@18 -- # local node= 00:04:52.060 13:17:10 -- setup/common.sh@19 -- # local var val 00:04:52.060 13:17:10 -- setup/common.sh@20 -- # local mem_f mem 00:04:52.060 13:17:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.060 13:17:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.060 13:17:10 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.060 13:17:10 -- 
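The per_node_1G_alloc test asks get_test_nr_hugepages for 1048576 kB on nodes 0 and 1, and the trace shows setup/hugepages.sh turning that into 512 default-size (2048 kB) pages for each listed node. A hedged re-creation of that arithmetic — the function name here is illustrative, not from the script:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the get_test_nr_hugepages_per_node math:
# a requested size in kB becomes a per-node count of hugepages.
per_node_hugepages() {
    local size_kb=$1; shift
    local hugepagesize_kb=2048            # assumed default hugepage size
    local nr=$(( size_kb / hugepagesize_kb ))
    local node
    for node in "$@"; do                  # one entry per requested NUMA node
        echo "node${node}=${nr}"
    done
}

per_node_hugepages 1048576 0 1
# node0=512
# node1=512
```

This matches the trace, where `nodes_test[_no_nodes]=512` is set once per user node before NRHUGE=512 HUGENODE=0,1 is exported to setup.sh.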
setup/common.sh@28 -- # mapfile -t mem 00:04:52.060 13:17:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.060 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.060 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.060 13:17:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68508332 kB' 'MemAvailable: 72488340 kB' 'Buffers: 9896 kB' 'Cached: 18226824 kB' 'SwapCached: 0 kB' 'Active: 15036280 kB' 'Inactive: 3731776 kB' 'Active(anon): 14478152 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534184 kB' 'Mapped: 202372 kB' 'Shmem: 13946816 kB' 'KReclaimable: 526288 kB' 'Slab: 948180 kB' 'SReclaimable: 526288 kB' 'SUnreclaim: 421892 kB' 'KernelStack: 16400 kB' 'PageTables: 8748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15848868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214184 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:52.060 13:17:10 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.060 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.060 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.060 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.060 13:17:10 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.060 13:17:10 -- setup/common.sh@32 -- # 
continue — loop over the remaining meminfo keys (MemAvailable through HardwareCorrupted), none matching \A\n\o\n\H\u\g\e\P\a\g\e\s; repeated iterations elided 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.061 13:17:10 -- setup/common.sh@33 -- # echo 0 00:04:52.061 13:17:10 -- setup/common.sh@33 -- # return 0 00:04:52.061 13:17:10 -- setup/hugepages.sh@97 -- # anon=0 00:04:52.061 13:17:10 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:52.061 13:17:10 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:52.061 13:17:10 -- setup/common.sh@18 -- # local node= 00:04:52.061 13:17:10 -- setup/common.sh@19 -- # local var val 00:04:52.061 13:17:10 -- setup/common.sh@20 -- # local mem_f mem 00:04:52.061 13:17:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.061 13:17:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.061 13:17:10 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.061 13:17:10 -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.061 13:17:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.061 13:17:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68509268 kB' 'MemAvailable: 72489276 kB'
'Buffers: 9896 kB' 'Cached: 18226828 kB' 'SwapCached: 0 kB' 'Active: 15035896 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477768 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534236 kB' 'Mapped: 202248 kB' 'Shmem: 13946820 kB' 'KReclaimable: 526288 kB' 'Slab: 948140 kB' 'SReclaimable: 526288 kB' 'SUnreclaim: 421852 kB' 'KernelStack: 16400 kB' 'PageTables: 8768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15848880 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214184 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.061 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.061 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.061 13:17:10 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:52.061 13:17:10 [... identical xtrace iterations elided: the read / [[ key == HugePages_Surp ]] / continue loop repeats for every /proc/meminfo key (Buffers through HugePages_Rsvd) ...] 00:04:52.326 13:17:10 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.326 13:17:10 -- setup/common.sh@33 -- # echo 0 00:04:52.326 13:17:10 -- setup/common.sh@33 -- # return 0 00:04:52.326 13:17:10 -- setup/hugepages.sh@99 -- # surp=0 00:04:52.326 13:17:10 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:52.326 13:17:10 -- setup/common.sh@17 -- # local
get=HugePages_Rsvd 00:04:52.326 13:17:10 -- setup/common.sh@18 -- # local node= 00:04:52.326 13:17:10 -- setup/common.sh@19 -- # local var val 00:04:52.326 13:17:10 -- setup/common.sh@20 -- # local mem_f mem 00:04:52.326 13:17:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.326 13:17:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.326 13:17:10 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.326 13:17:10 -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.326 13:17:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.326 13:17:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68509016 kB' 'MemAvailable: 72489024 kB' 'Buffers: 9896 kB' 'Cached: 18226840 kB' 'SwapCached: 0 kB' 'Active: 15035736 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477608 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534100 kB' 'Mapped: 202248 kB' 'Shmem: 13946832 kB' 'KReclaimable: 526288 kB' 'Slab: 948140 kB' 'SReclaimable: 526288 kB' 'SUnreclaim: 421852 kB' 'KernelStack: 16400 kB' 'PageTables: 8768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15848896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214184 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:52.326 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.326 13:17:10 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:52.326 13:17:10 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.326 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.326 13:17:10 [... identical xtrace iterations elided: the read / [[ key == HugePages_Rsvd ]] / continue loop repeats for each subsequent /proc/meminfo key (MemFree through HugePages_Free); the trace is truncated here ...] 00:04:52.327 13:17:10 -- setup/common.sh@32 --
# continue 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.327 13:17:10 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.327 13:17:10 -- setup/common.sh@33 -- # echo 0 00:04:52.327 13:17:10 -- setup/common.sh@33 -- # return 0 00:04:52.327 13:17:10 -- setup/hugepages.sh@100 -- # resv=0 00:04:52.327 13:17:10 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:52.327 nr_hugepages=1024 00:04:52.327 13:17:10 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:52.327 resv_hugepages=0 00:04:52.327 13:17:10 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:52.327 surplus_hugepages=0 00:04:52.327 13:17:10 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:52.327 anon_hugepages=0 00:04:52.327 13:17:10 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:52.327 13:17:10 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:52.327 13:17:10 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:52.327 13:17:10 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:52.327 13:17:10 -- setup/common.sh@18 -- # local node= 00:04:52.327 13:17:10 -- setup/common.sh@19 -- # local var val 00:04:52.327 13:17:10 -- setup/common.sh@20 -- # local mem_f mem 00:04:52.327 13:17:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.327 13:17:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.327 13:17:10 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.327 13:17:10 -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.327 13:17:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.327 13:17:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68510340 kB' 'MemAvailable: 72490244 kB' 
'Buffers: 9896 kB' 'Cached: 18226852 kB' 'SwapCached: 0 kB' 'Active: 15035472 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477344 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533932 kB' 'Mapped: 202248 kB' 'Shmem: 13946844 kB' 'KReclaimable: 526184 kB' 'Slab: 948032 kB' 'SReclaimable: 526184 kB' 'SUnreclaim: 421848 kB' 'KernelStack: 16384 kB' 'PageTables: 8716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15848908 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214136 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:52.327 13:17:10 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.327 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.327 13:17:10 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.327 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.327 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.327 13:17:10 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 
-- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 
00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:10 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:10 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- 
setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- 
setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.328 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.328 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 
-- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.329 13:17:11 -- setup/common.sh@33 -- # echo 1024 00:04:52.329 13:17:11 -- setup/common.sh@33 -- # return 0 00:04:52.329 13:17:11 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:52.329 13:17:11 -- setup/hugepages.sh@112 -- # get_nodes 00:04:52.329 13:17:11 -- setup/hugepages.sh@27 -- # local node 00:04:52.329 13:17:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:52.329 13:17:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:52.329 13:17:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:52.329 13:17:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:52.329 13:17:11 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:52.329 13:17:11 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:52.329 13:17:11 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:52.329 13:17:11 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:52.329 13:17:11 -- setup/hugepages.sh@117 -- # get_meminfo 
HugePages_Surp 0 00:04:52.329 13:17:11 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:52.329 13:17:11 -- setup/common.sh@18 -- # local node=0 00:04:52.329 13:17:11 -- setup/common.sh@19 -- # local var val 00:04:52.329 13:17:11 -- setup/common.sh@20 -- # local mem_f mem 00:04:52.329 13:17:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.329 13:17:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:52.329 13:17:11 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:52.329 13:17:11 -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.329 13:17:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116968 kB' 'MemFree: 40293584 kB' 'MemUsed: 7823384 kB' 'SwapCached: 0 kB' 'Active: 4548740 kB' 'Inactive: 285836 kB' 'Active(anon): 4131088 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528376 kB' 'Mapped: 113292 kB' 'AnonPages: 309352 kB' 'Shmem: 3824888 kB' 'KernelStack: 8696 kB' 'PageTables: 5448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361188 kB' 'Slab: 600080 kB' 'SReclaimable: 361188 kB' 'SUnreclaim: 238892 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- 
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.329 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.329 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- 
setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- 
setup/common.sh@33 -- # echo 0 00:04:52.330 13:17:11 -- setup/common.sh@33 -- # return 0 00:04:52.330 13:17:11 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:52.330 13:17:11 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:52.330 13:17:11 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:52.330 13:17:11 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:52.330 13:17:11 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:52.330 13:17:11 -- setup/common.sh@18 -- # local node=1 00:04:52.330 13:17:11 -- setup/common.sh@19 -- # local var val 00:04:52.330 13:17:11 -- setup/common.sh@20 -- # local mem_f mem 00:04:52.330 13:17:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.330 13:17:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:52.330 13:17:11 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:52.330 13:17:11 -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.330 13:17:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.330 13:17:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176560 kB' 'MemFree: 28222536 kB' 'MemUsed: 15954024 kB' 'SwapCached: 0 kB' 'Active: 10486584 kB' 'Inactive: 3445940 kB' 'Active(anon): 10346108 kB' 'Inactive(anon): 0 kB' 'Active(file): 140476 kB' 'Inactive(file): 3445940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 13708400 kB' 'Mapped: 88956 kB' 'AnonPages: 224324 kB' 'Shmem: 10121984 kB' 'KernelStack: 7656 kB' 'PageTables: 3184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 164996 kB' 'Slab: 347952 kB' 'SReclaimable: 164996 kB' 'SUnreclaim: 182956 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:52.330 13:17:11 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.330 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.330 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 
00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val 
_ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- 
setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.331 13:17:11 -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.331 13:17:11 -- setup/common.sh@32 -- # continue 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # IFS=': ' 00:04:52.331 13:17:11 -- setup/common.sh@31 -- # read -r var val _ 00:04:52.332 13:17:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.332 13:17:11 -- setup/common.sh@33 -- # echo 0 00:04:52.332 13:17:11 -- setup/common.sh@33 -- # return 0 00:04:52.332 13:17:11 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:52.332 13:17:11 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:52.332 13:17:11 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:52.332 13:17:11 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:52.332 node0=512 expecting 512 00:04:52.332 13:17:11 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:52.332 13:17:11 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:52.332 13:17:11 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:52.332 13:17:11 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:52.332 node1=512 expecting 512 00:04:52.332 13:17:11 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:52.332 00:04:52.332 real 0m5.920s 00:04:52.332 user 0m2.008s 00:04:52.332 sys 0m3.932s 00:04:52.332 13:17:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:52.332 13:17:11 -- common/autotest_common.sh@10 -- # set +x 00:04:52.332 ************************************ 00:04:52.332 END TEST per_node_1G_alloc 00:04:52.332 ************************************ 00:04:52.332 13:17:11 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:52.332 13:17:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:52.332 13:17:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:52.332 
13:17:11 -- common/autotest_common.sh@10 -- # set +x 00:04:52.332 ************************************ 00:04:52.332 START TEST even_2G_alloc 00:04:52.332 ************************************ 00:04:52.332 13:17:11 -- common/autotest_common.sh@1104 -- # even_2G_alloc 00:04:52.332 13:17:11 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:52.332 13:17:11 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:52.332 13:17:11 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:52.332 13:17:11 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:52.332 13:17:11 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:52.332 13:17:11 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:52.332 13:17:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:52.332 13:17:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:52.332 13:17:11 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:52.332 13:17:11 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:52.332 13:17:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:52.332 13:17:11 -- setup/hugepages.sh@83 -- # : 512 00:04:52.332 13:17:11 -- setup/hugepages.sh@84 -- # : 1 00:04:52.332 13:17:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:52.332 13:17:11 -- setup/hugepages.sh@83 -- # : 0 00:04:52.332 13:17:11 -- setup/hugepages.sh@84 -- # : 0 00:04:52.332 13:17:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.332 13:17:11 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:52.332 13:17:11 -- setup/hugepages.sh@153 -- # 
HUGE_EVEN_ALLOC=yes 00:04:52.332 13:17:11 -- setup/hugepages.sh@153 -- # setup output 00:04:52.332 13:17:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.332 13:17:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:56.596 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:56.596 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:56.596 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:58.505 13:17:17 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:58.505 13:17:17 -- setup/hugepages.sh@89 -- # local node 00:04:58.505 13:17:17 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:58.505 13:17:17 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:58.505 13:17:17 -- setup/hugepages.sh@92 -- # local surp 00:04:58.505 13:17:17 -- setup/hugepages.sh@93 -- # local resv 00:04:58.505 13:17:17 -- setup/hugepages.sh@94 -- # local anon 00:04:58.505 13:17:17 -- 
setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:58.505 13:17:17 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:58.505 13:17:17 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:58.505 13:17:17 -- setup/common.sh@18 -- # local node= 00:04:58.505 13:17:17 -- setup/common.sh@19 -- # local var val 00:04:58.505 13:17:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:58.505 13:17:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.505 13:17:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.505 13:17:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.505 13:17:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.505 13:17:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68505352 kB' 'MemAvailable: 72485256 kB' 'Buffers: 9896 kB' 'Cached: 18226988 kB' 'SwapCached: 0 kB' 'Active: 15035744 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477616 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532928 kB' 'Mapped: 202448 kB' 'Shmem: 13946980 kB' 'KReclaimable: 526184 kB' 'Slab: 948108 kB' 'SReclaimable: 526184 kB' 'SUnreclaim: 421924 kB' 'KernelStack: 16352 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15849684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214312 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 
'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Active == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.505 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.505 13:17:17 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 
00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': 
' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:58.506 13:17:17 -- setup/common.sh@33 -- # echo 0 00:04:58.506 13:17:17 -- setup/common.sh@33 -- # return 0 00:04:58.506 13:17:17 -- setup/hugepages.sh@97 -- # anon=0 00:04:58.506 13:17:17 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:58.506 13:17:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:58.506 13:17:17 -- setup/common.sh@18 -- # local node= 00:04:58.506 13:17:17 -- setup/common.sh@19 -- # local var val 
00:04:58.506 13:17:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:58.506 13:17:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.506 13:17:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.506 13:17:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.506 13:17:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.506 13:17:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68505776 kB' 'MemAvailable: 72485680 kB' 'Buffers: 9896 kB' 'Cached: 18226992 kB' 'SwapCached: 0 kB' 'Active: 15036116 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477988 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533292 kB' 'Mapped: 202448 kB' 'Shmem: 13946984 kB' 'KReclaimable: 526184 kB' 'Slab: 948108 kB' 'SReclaimable: 526184 kB' 'SUnreclaim: 421924 kB' 'KernelStack: 16352 kB' 'PageTables: 8656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15849696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214296 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- 
# continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.506 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.506 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 
13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 
00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val 
_ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.507 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.507 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 
13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.508 13:17:17 -- setup/common.sh@33 -- # echo 0 00:04:58.508 13:17:17 -- setup/common.sh@33 -- # return 0 00:04:58.508 13:17:17 -- setup/hugepages.sh@99 -- # surp=0 00:04:58.508 13:17:17 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:58.508 13:17:17 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:58.508 13:17:17 -- setup/common.sh@18 -- # local node= 00:04:58.508 13:17:17 -- setup/common.sh@19 -- # local var val 00:04:58.508 13:17:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:58.508 13:17:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.508 13:17:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.508 13:17:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.508 13:17:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.508 13:17:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68505776 kB' 'MemAvailable: 72485680 kB' 'Buffers: 9896 kB' 'Cached: 18226992 kB' 'SwapCached: 0 kB' 'Active: 15035516 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477388 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533656 kB' 'Mapped: 202316 kB' 'Shmem: 13946984 kB' 'KReclaimable: 526184 kB' 'Slab: 948108 kB' 
'SReclaimable: 526184 kB' 'SUnreclaim: 421924 kB' 'KernelStack: 16352 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15849712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214328 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 
00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 
13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.508 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.508 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 
13:17:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- 
setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 
-- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.509 13:17:17 -- setup/common.sh@33 -- # echo 0 00:04:58.509 13:17:17 -- setup/common.sh@33 -- # return 0 00:04:58.509 13:17:17 -- setup/hugepages.sh@100 -- # resv=0 00:04:58.509 13:17:17 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:58.509 nr_hugepages=1024 00:04:58.509 13:17:17 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:58.509 resv_hugepages=0 00:04:58.509 13:17:17 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:58.509 surplus_hugepages=0 00:04:58.509 13:17:17 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:58.509 anon_hugepages=0 00:04:58.509 13:17:17 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:58.509 13:17:17 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:58.509 13:17:17 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:58.509 13:17:17 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:58.509 13:17:17 -- setup/common.sh@18 -- # local node= 00:04:58.509 13:17:17 -- setup/common.sh@19 -- # local var val 00:04:58.509 13:17:17 -- 
setup/common.sh@20 -- # local mem_f mem 00:04:58.509 13:17:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.509 13:17:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.509 13:17:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.509 13:17:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.509 13:17:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68505776 kB' 'MemAvailable: 72485680 kB' 'Buffers: 9896 kB' 'Cached: 18227000 kB' 'SwapCached: 0 kB' 'Active: 15035124 kB' 'Inactive: 3731776 kB' 'Active(anon): 14476996 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533256 kB' 'Mapped: 202316 kB' 'Shmem: 13946992 kB' 'KReclaimable: 526184 kB' 'Slab: 948108 kB' 'SReclaimable: 526184 kB' 'SUnreclaim: 421924 kB' 'KernelStack: 16336 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15849728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214328 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- 
setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 
00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.509 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.509 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 
00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 
00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.510 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.510 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 
00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:58.511 13:17:17 -- setup/common.sh@33 -- # echo 1024 00:04:58.511 13:17:17 -- setup/common.sh@33 -- # return 0 00:04:58.511 13:17:17 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:58.511 13:17:17 -- setup/hugepages.sh@112 -- # get_nodes 00:04:58.511 13:17:17 -- setup/hugepages.sh@27 -- # local node 00:04:58.511 13:17:17 -- setup/hugepages.sh@29 -- # for node in 
/sys/devices/system/node/node+([0-9]) 00:04:58.511 13:17:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:58.511 13:17:17 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:58.511 13:17:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:58.511 13:17:17 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:58.511 13:17:17 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:58.511 13:17:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:58.511 13:17:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:58.511 13:17:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:58.511 13:17:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:58.511 13:17:17 -- setup/common.sh@18 -- # local node=0 00:04:58.511 13:17:17 -- setup/common.sh@19 -- # local var val 00:04:58.511 13:17:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:58.511 13:17:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.511 13:17:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:58.511 13:17:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:58.511 13:17:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.511 13:17:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116968 kB' 'MemFree: 40287452 kB' 'MemUsed: 7829516 kB' 'SwapCached: 0 kB' 'Active: 4548912 kB' 'Inactive: 285836 kB' 'Active(anon): 4131260 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528448 kB' 'Mapped: 113296 kB' 'AnonPages: 309480 kB' 'Shmem: 3824960 kB' 'KernelStack: 8696 kB' 'PageTables: 5560 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361188 kB' 'Slab: 599932 kB' 'SReclaimable: 361188 kB' 'SUnreclaim: 238744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.511 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.511 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ 
SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Total 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@33 -- # echo 0 00:04:58.512 13:17:17 -- setup/common.sh@33 -- # return 0 00:04:58.512 13:17:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:58.512 13:17:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:58.512 13:17:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:58.512 13:17:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:58.512 13:17:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:58.512 13:17:17 -- setup/common.sh@18 -- # local node=1 00:04:58.512 13:17:17 -- setup/common.sh@19 -- # local var val 00:04:58.512 13:17:17 -- setup/common.sh@20 -- # local mem_f mem 00:04:58.512 13:17:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.512 13:17:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:58.512 13:17:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:58.512 13:17:17 -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.512 13:17:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176560 kB' 'MemFree: 28218072 kB' 'MemUsed: 15958488 kB' 'SwapCached: 0 kB' 
'Active: 10485928 kB' 'Inactive: 3445940 kB' 'Active(anon): 10345452 kB' 'Inactive(anon): 0 kB' 'Active(file): 140476 kB' 'Inactive(file): 3445940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 13708456 kB' 'Mapped: 89020 kB' 'AnonPages: 223520 kB' 'Shmem: 10122040 kB' 'KernelStack: 7672 kB' 'PageTables: 3120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 164996 kB' 'Slab: 348176 kB' 'SReclaimable: 164996 kB' 'SUnreclaim: 183180 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 
00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 
13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.512 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.512 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var 
val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # continue 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # IFS=': ' 00:04:58.513 13:17:17 -- setup/common.sh@31 -- # read -r var val _ 00:04:58.513 13:17:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.513 13:17:17 -- setup/common.sh@33 -- # echo 0 00:04:58.513 13:17:17 -- setup/common.sh@33 -- # return 0 00:04:58.513 13:17:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:58.513 13:17:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:58.513 13:17:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:58.513 13:17:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:58.513 13:17:17 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:58.513 node0=512 expecting 512 00:04:58.513 13:17:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:58.513 13:17:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:58.513 13:17:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:58.513 13:17:17 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:58.513 node1=512 expecting 512 00:04:58.513 13:17:17 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:58.513 00:04:58.513 real 
0m6.206s 00:04:58.513 user 0m2.125s 00:04:58.513 sys 0m4.161s 00:04:58.513 13:17:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:58.513 13:17:17 -- common/autotest_common.sh@10 -- # set +x 00:04:58.513 ************************************ 00:04:58.513 END TEST even_2G_alloc 00:04:58.513 ************************************ 00:04:58.513 13:17:17 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:58.513 13:17:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:58.513 13:17:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:58.513 13:17:17 -- common/autotest_common.sh@10 -- # set +x 00:04:58.772 ************************************ 00:04:58.772 START TEST odd_alloc 00:04:58.772 ************************************ 00:04:58.772 13:17:17 -- common/autotest_common.sh@1104 -- # odd_alloc 00:04:58.772 13:17:17 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:58.772 13:17:17 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:58.772 13:17:17 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:58.772 13:17:17 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:58.772 13:17:17 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:58.772 13:17:17 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:58.772 13:17:17 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:58.772 13:17:17 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:58.772 13:17:17 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:58.772 13:17:17 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:58.772 13:17:17 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 
1]=512 00:04:58.772 13:17:17 -- setup/hugepages.sh@83 -- # : 513 00:04:58.772 13:17:17 -- setup/hugepages.sh@84 -- # : 1 00:04:58.772 13:17:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:58.772 13:17:17 -- setup/hugepages.sh@83 -- # : 0 00:04:58.772 13:17:17 -- setup/hugepages.sh@84 -- # : 0 00:04:58.772 13:17:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:58.772 13:17:17 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:58.772 13:17:17 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:58.772 13:17:17 -- setup/hugepages.sh@160 -- # setup output 00:04:58.772 13:17:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:58.772 13:17:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:02.961 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:02.961 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:02.961 0000:80:04.1 (8086 2021): Already 
using the vfio-pci driver 00:05:02.961 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:04.339 13:17:23 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:05:04.339 13:17:23 -- setup/hugepages.sh@89 -- # local node 00:05:04.339 13:17:23 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:04.339 13:17:23 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:04.339 13:17:23 -- setup/hugepages.sh@92 -- # local surp 00:05:04.339 13:17:23 -- setup/hugepages.sh@93 -- # local resv 00:05:04.339 13:17:23 -- setup/hugepages.sh@94 -- # local anon 00:05:04.339 13:17:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:04.339 13:17:23 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:04.339 13:17:23 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:04.339 13:17:23 -- setup/common.sh@18 -- # local node= 00:05:04.339 13:17:23 -- setup/common.sh@19 -- # local var val 00:05:04.339 13:17:23 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.339 13:17:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.339 13:17:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.339 13:17:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.339 13:17:23 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.339 13:17:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68522172 kB' 'MemAvailable: 72501980 kB' 'Buffers: 9896 kB' 'Cached: 18227136 kB' 'SwapCached: 0 kB' 'Active: 15035516 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477388 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 
'AnonPages: 533708 kB' 'Mapped: 202476 kB' 'Shmem: 13947128 kB' 'KReclaimable: 526088 kB' 'Slab: 947668 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 421580 kB' 'KernelStack: 16416 kB' 'PageTables: 8808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 15849988 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ 
Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 
00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 
00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var 
val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.339 13:17:23 -- setup/common.sh@33 -- # echo 0 00:05:04.339 13:17:23 -- setup/common.sh@33 -- # return 0 00:05:04.339 13:17:23 -- setup/hugepages.sh@97 -- # anon=0 00:05:04.339 13:17:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:04.339 13:17:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.339 13:17:23 -- setup/common.sh@18 -- # local node= 00:05:04.339 13:17:23 -- setup/common.sh@19 -- # local var val 00:05:04.339 13:17:23 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.339 13:17:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.339 13:17:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.339 13:17:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.339 13:17:23 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.339 13:17:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68524032 kB' 'MemAvailable: 72503840 kB' 'Buffers: 9896 kB' 'Cached: 18227144 kB' 'SwapCached: 0 kB' 'Active: 15035400 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477272 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533432 kB' 'Mapped: 202388 kB' 'Shmem: 13947136 kB' 'KReclaimable: 526088 kB' 'Slab: 947644 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 421556 kB' 'KernelStack: 16384 kB' 'PageTables: 8688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 15850004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 
214216 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.339 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.339 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # 
continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 
13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 
00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 
-- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 
00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 
13:17:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.340 13:17:23 -- setup/common.sh@33 -- # echo 0 00:05:04.340 13:17:23 -- setup/common.sh@33 -- # return 0 00:05:04.340 13:17:23 -- setup/hugepages.sh@99 -- # surp=0 00:05:04.340 13:17:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:04.340 13:17:23 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:04.340 13:17:23 -- setup/common.sh@18 -- # local node= 00:05:04.340 13:17:23 -- setup/common.sh@19 -- # local var val 00:05:04.340 13:17:23 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.340 13:17:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.340 13:17:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.340 13:17:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.340 13:17:23 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.340 13:17:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@16 -- # 
printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68524032 kB' 'MemAvailable: 72503840 kB' 'Buffers: 9896 kB' 'Cached: 18227144 kB' 'SwapCached: 0 kB' 'Active: 15035100 kB' 'Inactive: 3731776 kB' 'Active(anon): 14476972 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533124 kB' 'Mapped: 202388 kB' 'Shmem: 13947136 kB' 'KReclaimable: 526088 kB' 'Slab: 947644 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 421556 kB' 'KernelStack: 16384 kB' 'PageTables: 8688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 15850148 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214216 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # 
continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.340 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.340 13:17:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 
00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # 
read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 
13:17:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.341 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.341 13:17:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.602 13:17:23 -- setup/common.sh@33 -- # echo 0 00:05:04.602 13:17:23 -- setup/common.sh@33 -- # return 0 00:05:04.602 13:17:23 -- setup/hugepages.sh@100 -- # resv=0 00:05:04.602 13:17:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:04.602 nr_hugepages=1025 00:05:04.602 13:17:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:04.602 resv_hugepages=0 00:05:04.602 13:17:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:04.602 surplus_hugepages=0 00:05:04.602 
13:17:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:04.602 anon_hugepages=0 00:05:04.602 13:17:23 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:04.602 13:17:23 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:04.602 13:17:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:04.602 13:17:23 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:04.602 13:17:23 -- setup/common.sh@18 -- # local node= 00:05:04.602 13:17:23 -- setup/common.sh@19 -- # local var val 00:05:04.602 13:17:23 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.602 13:17:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.602 13:17:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.602 13:17:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.602 13:17:23 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.602 13:17:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68527516 kB' 'MemAvailable: 72507324 kB' 'Buffers: 9896 kB' 'Cached: 18227176 kB' 'SwapCached: 0 kB' 'Active: 15035292 kB' 'Inactive: 3731776 kB' 'Active(anon): 14477164 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533264 kB' 'Mapped: 202388 kB' 'Shmem: 13947168 kB' 'KReclaimable: 526088 kB' 'Slab: 947644 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 421556 kB' 'KernelStack: 16368 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 15850168 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214216 kB' 
'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.602 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.602 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # 
continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 
00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue 00:05:04.603 13:17:23 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:04.603 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.603 13:17:23 -- setup/common.sh@32 -- # continue [... identical skip/continue iterations over the remaining meminfo keys (Mapped ... Unaccepted) trimmed ...] 00:05:04.604 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.604 13:17:23 -- setup/common.sh@33 -- # echo 1025 00:05:04.604 13:17:23 -- setup/common.sh@33 -- # return 0 00:05:04.604 13:17:23 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:04.604 13:17:23 -- setup/hugepages.sh@112 -- # get_nodes 00:05:04.604 13:17:23 -- setup/hugepages.sh@27 -- # local node 00:05:04.604 13:17:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.604 13:17:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:04.604 13:17:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.604 13:17:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:04.604 13:17:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:04.604 13:17:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:04.604 13:17:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:04.604 13:17:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:04.604 13:17:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:04.604 13:17:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.604 13:17:23 -- setup/common.sh@18 -- # local node=0 00:05:04.604 13:17:23 -- setup/common.sh@19 -- # local var val 00:05:04.604 13:17:23 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.604 13:17:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.604 13:17:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:04.604 13:17:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:04.604 13:17:23 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.604 13:17:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.604 13:17:23 -- setup/common.sh@31 -- # IFS=': '
00:05:04.604 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.604 13:17:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116968 kB' 'MemFree: 40298520 kB' 'MemUsed: 7818448 kB' 'SwapCached: 0 kB' 'Active: 4549020 kB' 'Inactive: 285836 kB' 'Active(anon): 4131368 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528572 kB' 'Mapped: 113296 kB' 'AnonPages: 309480 kB' 'Shmem: 3825084 kB' 'KernelStack: 8680 kB' 'PageTables: 5508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361092 kB' 'Slab: 599692 kB' 'SReclaimable: 361092 kB' 'SUnreclaim: 238600 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:04.604 13:17:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.604 13:17:23 -- setup/common.sh@32 -- # continue [... identical skip/continue iterations over the remaining meminfo keys (MemFree ... HugePages_Free) trimmed ...] 00:05:04.605 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.605 13:17:23 -- setup/common.sh@33 -- # echo 0 00:05:04.605 13:17:23 -- setup/common.sh@33 -- # return 0 00:05:04.605 13:17:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:04.605 13:17:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:04.605 13:17:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:04.605 13:17:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:04.605 13:17:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.605 13:17:23 -- setup/common.sh@18 -- # local node=1 00:05:04.605 13:17:23 -- setup/common.sh@19 -- # local var val 00:05:04.605 13:17:23 -- setup/common.sh@20 -- # local mem_f mem 00:05:04.605 13:17:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.605 13:17:23 --
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:04.605 13:17:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:04.605 13:17:23 -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.605 13:17:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.605 13:17:23 -- setup/common.sh@31 -- # IFS=': ' 00:05:04.605 13:17:23 -- setup/common.sh@31 -- # read -r var val _ 00:05:04.605 13:17:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176560 kB' 'MemFree: 28229392 kB' 'MemUsed: 15947168 kB' 'SwapCached: 0 kB' 'Active: 10486936 kB' 'Inactive: 3445940 kB' 'Active(anon): 10346460 kB' 'Inactive(anon): 0 kB' 'Active(file): 140476 kB' 'Inactive(file): 3445940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 13708512 kB' 'Mapped: 89092 kB' 'AnonPages: 224512 kB' 'Shmem: 10122096 kB' 'KernelStack: 7720 kB' 'PageTables: 3248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 164996 kB' 'Slab: 347952 kB' 'SReclaimable: 164996 kB' 'SUnreclaim: 182956 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:04.605 13:17:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.605 13:17:23 -- setup/common.sh@32 -- # continue [... identical skip/continue iterations over the remaining meminfo keys (MemFree ... HugePages_Free) trimmed ...] 00:05:04.606 13:17:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.606 13:17:23 -- setup/common.sh@33 -- # echo 0 00:05:04.606 13:17:23 -- setup/common.sh@33 -- # return 0 00:05:04.606 13:17:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:04.606 13:17:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:04.606 13:17:23 -- setup/hugepages.sh@127 -- #
sorted_s[nodes_sys[node]]=1 00:05:04.606 13:17:23 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:04.606 node0=512 expecting 513 00:05:04.606 13:17:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:04.606 13:17:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:04.606 13:17:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:04.606 13:17:23 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:04.606 node1=513 expecting 512 00:05:04.606 13:17:23 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:04.606 00:05:04.606 real 0m5.937s 00:05:04.606 user 0m2.042s 00:05:04.606 sys 0m3.953s 00:05:04.606 13:17:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.606 13:17:23 -- common/autotest_common.sh@10 -- # set +x 00:05:04.606 ************************************ 00:05:04.606 END TEST odd_alloc 00:05:04.606 ************************************ 00:05:04.606 13:17:23 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:04.606 13:17:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:04.606 13:17:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:04.606 13:17:23 -- common/autotest_common.sh@10 -- # set +x 00:05:04.606 ************************************ 00:05:04.606 START TEST custom_alloc 00:05:04.606 ************************************ 00:05:04.606 13:17:23 -- common/autotest_common.sh@1104 -- # custom_alloc 00:05:04.606 13:17:23 -- setup/hugepages.sh@167 -- # local IFS=, 00:05:04.606 13:17:23 -- setup/hugepages.sh@169 -- # local node 00:05:04.606 13:17:23 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:04.606 13:17:23 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:04.606 13:17:23 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:04.606 13:17:23 -- setup/hugepages.sh@49 -- # local size=1048576 
00:05:04.606 13:17:23 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:04.606 13:17:23 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:04.606 13:17:23 -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:04.606 13:17:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:04.606 13:17:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:04.606 13:17:23 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:04.606 13:17:23 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:04.606 13:17:23 -- setup/hugepages.sh@83 -- # : 256 00:05:04.606 13:17:23 -- setup/hugepages.sh@84 -- # : 1 00:05:04.606 13:17:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:04.606 13:17:23 -- setup/hugepages.sh@83 -- # : 0 00:05:04.606 13:17:23 -- setup/hugepages.sh@84 -- # : 0 00:05:04.606 13:17:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:04.606 13:17:23 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:04.606 13:17:23 -- setup/hugepages.sh@49 -- # local size=2097152 00:05:04.606 13:17:23 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:04.606 13:17:23 -- 
setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:04.606 13:17:23 -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:04.606 13:17:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:04.606 13:17:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:04.606 13:17:23 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:04.606 13:17:23 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:04.606 13:17:23 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:04.606 13:17:23 -- setup/hugepages.sh@78 -- # return 0 00:05:04.606 13:17:23 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:05:04.606 13:17:23 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:04.606 13:17:23 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:04.606 13:17:23 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:04.606 13:17:23 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:04.606 13:17:23 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:05:04.606 13:17:23 -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:04.606 13:17:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:04.606 13:17:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:04.606 13:17:23 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:04.606 13:17:23 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:04.606 13:17:23 -- 
setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:04.606 13:17:23 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:05:04.607 13:17:23 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:04.607 13:17:23 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:04.607 13:17:23 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:04.607 13:17:23 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:05:04.607 13:17:23 -- setup/hugepages.sh@78 -- # return 0 00:05:04.607 13:17:23 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:04.607 13:17:23 -- setup/hugepages.sh@187 -- # setup output 00:05:04.607 13:17:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.607 13:17:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:08.797 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:08.797 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:08.797 0000:80:04.1 (8086 2021): Already using the 
vfio-pci driver 00:05:08.797 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:10.724 13:17:29 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:10.724 13:17:29 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:10.724 13:17:29 -- setup/hugepages.sh@89 -- # local node 00:05:10.724 13:17:29 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:10.724 13:17:29 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:10.724 13:17:29 -- setup/hugepages.sh@92 -- # local surp 00:05:10.724 13:17:29 -- setup/hugepages.sh@93 -- # local resv 00:05:10.724 13:17:29 -- setup/hugepages.sh@94 -- # local anon 00:05:10.724 13:17:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:10.724 13:17:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:10.724 13:17:29 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:10.724 13:17:29 -- setup/common.sh@18 -- # local node= 00:05:10.724 13:17:29 -- setup/common.sh@19 -- # local var val 00:05:10.724 13:17:29 -- setup/common.sh@20 -- # local mem_f mem 00:05:10.724 13:17:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.724 13:17:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.724 13:17:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.724 13:17:29 -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.724 13:17:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.724 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.724 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.724 13:17:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 67487740 kB' 'MemAvailable: 71467548 kB' 'Buffers: 9896 kB' 'Cached: 18227304 kB' 'SwapCached: 0 kB' 'Active: 15036652 kB' 'Inactive: 3731776 kB' 'Active(anon): 14478524 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 
0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534104 kB' 'Mapped: 202532 kB' 'Shmem: 13947296 kB' 'KReclaimable: 526088 kB' 'Slab: 948744 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 422656 kB' 'KernelStack: 16320 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 15851480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214312 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:10.724 13:17:29 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.724 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.724 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.724 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.724 13:17:29 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.724 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.724 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val 
_ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- 
# [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 
00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.725 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 
00:05:10.725 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.725 13:17:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.725 13:17:29 -- setup/common.sh@33 -- # echo 0 00:05:10.725 13:17:29 -- setup/common.sh@33 -- # return 0 00:05:10.725 13:17:29 -- setup/hugepages.sh@97 -- # anon=0 00:05:10.726 13:17:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:10.726 13:17:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.726 13:17:29 -- setup/common.sh@18 -- # local node= 00:05:10.726 13:17:29 -- setup/common.sh@19 -- # local var val 00:05:10.726 13:17:29 -- setup/common.sh@20 -- # local mem_f mem 00:05:10.726 13:17:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.726 13:17:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.726 13:17:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.726 13:17:29 -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.726 13:17:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.726 13:17:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 67488220 kB' 'MemAvailable: 71468028 kB' 'Buffers: 9896 kB' 'Cached: 18227308 kB' 'SwapCached: 0 kB' 'Active: 15036812 kB' 'Inactive: 3731776 kB' 'Active(anon): 14478684 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534172 kB' 'Mapped: 202520 kB' 'Shmem: 13947300 kB' 'KReclaimable: 526088 kB' 'Slab: 948740 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 422652 kB' 'KernelStack: 16304 kB' 'PageTables: 8504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 15852248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214280 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 
13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 
00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.726 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.726 13:17:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 
-- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 
00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.727 13:17:29 -- setup/common.sh@33 -- # echo 0 00:05:10.727 13:17:29 -- setup/common.sh@33 -- # return 0 00:05:10.727 13:17:29 -- setup/hugepages.sh@99 -- # surp=0 00:05:10.727 13:17:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:10.727 13:17:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:10.727 13:17:29 -- setup/common.sh@18 -- # local node= 00:05:10.727 13:17:29 -- setup/common.sh@19 -- # local var val 00:05:10.727 13:17:29 -- setup/common.sh@20 -- # local mem_f mem 00:05:10.727 13:17:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.727 13:17:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.727 13:17:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.727 13:17:29 -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.727 13:17:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 67489076 kB' 'MemAvailable: 71468884 kB' 'Buffers: 9896 kB' 'Cached: 18227316 kB' 'SwapCached: 0 kB' 'Active: 15036248 kB' 'Inactive: 3731776 kB' 'Active(anon): 14478120 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534148 kB' 'Mapped: 202444 kB' 'Shmem: 13947308 kB' 'KReclaimable: 526088 kB' 'Slab: 948724 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 422636 kB' 'KernelStack: 16272 kB' 'PageTables: 8396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 15851504 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214232 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ MemAvailable 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.727 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.727 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 
13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 
-- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 
13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.728 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.728 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.729 13:17:29 -- setup/common.sh@33 -- # echo 0 00:05:10.729 13:17:29 -- setup/common.sh@33 -- # return 0 00:05:10.729 13:17:29 -- setup/hugepages.sh@100 -- # resv=0 00:05:10.729 13:17:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:10.729 nr_hugepages=1536 00:05:10.729 13:17:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:10.729 resv_hugepages=0 00:05:10.729 13:17:29 -- setup/hugepages.sh@104 -- # echo 
surplus_hugepages=0 00:05:10.729 surplus_hugepages=0 00:05:10.729 13:17:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:10.729 anon_hugepages=0 00:05:10.729 13:17:29 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:10.729 13:17:29 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:10.729 13:17:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:10.729 13:17:29 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:10.729 13:17:29 -- setup/common.sh@18 -- # local node= 00:05:10.729 13:17:29 -- setup/common.sh@19 -- # local var val 00:05:10.729 13:17:29 -- setup/common.sh@20 -- # local mem_f mem 00:05:10.729 13:17:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.729 13:17:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.729 13:17:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.729 13:17:29 -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.729 13:17:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 67489876 kB' 'MemAvailable: 71469684 kB' 'Buffers: 9896 kB' 'Cached: 18227336 kB' 'SwapCached: 0 kB' 'Active: 15036300 kB' 'Inactive: 3731776 kB' 'Active(anon): 14478172 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534160 kB' 'Mapped: 202444 kB' 'Shmem: 13947328 kB' 'KReclaimable: 526088 kB' 'Slab: 948724 kB' 'SReclaimable: 526088 kB' 'SUnreclaim: 422636 kB' 'KernelStack: 16304 kB' 'PageTables: 8504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 
15851520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214232 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.729 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.729 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
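Between the scans, the trace echoes `nr_hugepages=1536`, `resv_hugepages=0`, and `surplus_hugepages=0`, then evaluates `(( 1536 == nr_hugepages + surp + resv ))`. A sketch of that consistency check, assuming a simplified helper (`check_hugepages` is a hypothetical name, not part of `setup/hugepages.sh`):

```shell
#!/usr/bin/env bash
# Sketch of the hugepages accounting traced at hugepages.sh@107/@110
# (assumption: the observed HugePages_Total must equal the requested page
# count plus surplus and reserved pages; helper name is hypothetical).
check_hugepages() {
    local total=$1 requested=$2 surp=$3 resv=$4
    (( total == requested + surp + resv ))
}
```

With the values in this run, `check_hugepages 1536 1536 0 0` succeeds, so the test proceeds to apportioning pages per NUMA node.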
00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.730 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:10.730 13:17:29 -- setup/common.sh@33 -- # echo 1536 00:05:10.730 13:17:29 -- setup/common.sh@33 -- # return 0 00:05:10.730 13:17:29 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:10.730 13:17:29 -- setup/hugepages.sh@112 -- # get_nodes 00:05:10.730 13:17:29 -- setup/hugepages.sh@27 -- # local node 00:05:10.730 13:17:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:10.730 13:17:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:10.730 13:17:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:10.730 13:17:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:10.730 13:17:29 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:10.730 13:17:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:10.730 13:17:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:10.730 13:17:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:10.730 13:17:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:10.730 13:17:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.730 13:17:29 -- setup/common.sh@18 -- # local node=0 00:05:10.730 13:17:29 -- setup/common.sh@19 -- # local var val 00:05:10.730 13:17:29 -- setup/common.sh@20 -- # local mem_f mem 00:05:10.730 13:17:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.730 13:17:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:10.730 13:17:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:10.730 13:17:29 -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.730 13:17:29 -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:05:10.730 13:17:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116968 kB' 'MemFree: 40287680 kB' 'MemUsed: 7829288 kB' 'SwapCached: 0 kB' 'Active: 4549228 kB' 'Inactive: 285836 kB' 'Active(anon): 4131576 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528700 kB' 'Mapped: 113292 kB' 'AnonPages: 309568 kB' 'Shmem: 3825212 kB' 'KernelStack: 8648 kB' 'PageTables: 5364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361092 kB' 'Slab: 599928 kB' 'SReclaimable: 361092 kB' 'SUnreclaim: 238836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.730 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # 
continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 
-- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 
13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.731 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.731 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.731 13:17:29 -- setup/common.sh@33 -- # echo 0 00:05:10.731 13:17:29 -- setup/common.sh@33 -- # return 0 00:05:10.731 13:17:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:10.731 13:17:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:10.731 13:17:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:10.731 13:17:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:10.731 13:17:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.731 13:17:29 -- setup/common.sh@18 -- # local node=1 00:05:10.731 13:17:29 -- setup/common.sh@19 -- # local var val 00:05:10.731 13:17:29 -- setup/common.sh@20 -- # local mem_f mem 
00:05:10.731 13:17:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.731 13:17:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:10.732 13:17:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:10.732 13:17:29 -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.732 13:17:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176560 kB' 'MemFree: 27202448 kB' 'MemUsed: 16974112 kB' 'SwapCached: 0 kB' 'Active: 10487104 kB' 'Inactive: 3445940 kB' 'Active(anon): 10346628 kB' 'Inactive(anon): 0 kB' 'Active(file): 140476 kB' 'Inactive(file): 3445940 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 13708544 kB' 'Mapped: 89152 kB' 'AnonPages: 224592 kB' 'Shmem: 10122128 kB' 'KernelStack: 7656 kB' 'PageTables: 3140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 164996 kB' 'Slab: 348796 kB' 'SReclaimable: 164996 kB' 'SUnreclaim: 183800 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- 
setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ 
Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- 
setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # 
continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.732 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.732 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.733 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.733 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.733 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.733 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.733 13:17:29 -- setup/common.sh@32 -- # continue 00:05:10.733 13:17:29 -- setup/common.sh@31 -- # IFS=': ' 00:05:10.733 13:17:29 -- setup/common.sh@31 -- # read -r var val _ 00:05:10.733 13:17:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.733 13:17:29 -- setup/common.sh@33 -- # echo 0 00:05:10.733 13:17:29 -- setup/common.sh@33 -- # return 0 00:05:10.733 13:17:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:10.733 13:17:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:10.733 13:17:29 -- setup/hugepages.sh@127 -- # 
sorted_t[nodes_test[node]]=1 00:05:10.733 13:17:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:10.733 13:17:29 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:10.733 node0=512 expecting 512 00:05:10.733 13:17:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:10.733 13:17:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:10.733 13:17:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:10.733 13:17:29 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:10.733 node1=1024 expecting 1024 00:05:10.733 13:17:29 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:10.733 00:05:10.733 real 0m6.181s 00:05:10.733 user 0m2.208s 00:05:10.733 sys 0m4.051s 00:05:10.733 13:17:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.733 13:17:29 -- common/autotest_common.sh@10 -- # set +x 00:05:10.733 ************************************ 00:05:10.733 END TEST custom_alloc 00:05:10.733 ************************************ 00:05:10.733 13:17:29 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:10.733 13:17:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:10.733 13:17:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:10.733 13:17:29 -- common/autotest_common.sh@10 -- # set +x 00:05:10.733 ************************************ 00:05:10.733 START TEST no_shrink_alloc 00:05:10.733 ************************************ 00:05:10.733 13:17:29 -- common/autotest_common.sh@1104 -- # no_shrink_alloc 00:05:10.733 13:17:29 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:10.733 13:17:29 -- setup/hugepages.sh@49 -- # local size=2097152 00:05:10.733 13:17:29 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:10.733 13:17:29 -- setup/hugepages.sh@51 -- # shift 00:05:10.733 13:17:29 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:10.733 13:17:29 -- setup/hugepages.sh@52 -- # local 
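The trace above (and throughout this test) repeatedly exercises a meminfo-scanning helper from setup/common.sh: it splits each `var: val` line on `IFS=': '`, `continue`s past non-matching fields, and echoes the value once the requested field matches. A minimal sketch of that behavior, reconstructed from the trace rather than taken from the actual setup/common.sh source (the function name and here-string sample below are hypothetical), would be:

```shell
#!/usr/bin/env bash
# Hypothetical reconstruction of the helper seen in the trace:
# scan a meminfo-style stream and print the value of the requested field.
get_meminfo_value() {
  local get=$1 var val _
  # Same parse loop as the trace: split on ': ', skip until the field matches.
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  return 1
}

# Example against a fixed snippet instead of the live /proc/meminfo,
# using values that appear in the node0 dump above:
sample='MemTotal: 48116968 kB
HugePages_Total: 512
HugePages_Free: 512
HugePages_Surp: 0'
get_meminfo_value HugePages_Total <<< "$sample"   # prints 512
```

In the real script the stream comes from `/proc/meminfo` or, when a node is given, `/sys/devices/system/node/nodeN/meminfo` (with the `Node N ` prefix stripped first), which is why the per-node HugePages_Total/HugePages_Surp reads above return 512 and 1024 for nodes 0 and 1.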
node_ids 00:05:10.733 13:17:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:10.733 13:17:29 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:10.733 13:17:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:10.733 13:17:29 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:10.733 13:17:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:10.733 13:17:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:10.733 13:17:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:10.733 13:17:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:10.733 13:17:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:10.733 13:17:29 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:10.733 13:17:29 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:10.733 13:17:29 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:10.733 13:17:29 -- setup/hugepages.sh@73 -- # return 0 00:05:10.733 13:17:29 -- setup/hugepages.sh@198 -- # setup output 00:05:10.733 13:17:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:10.733 13:17:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:14.926 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:14.926 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.6 (8086 2021): Already 
using the vfio-pci driver 00:05:14.926 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:14.926 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:16.868 13:17:35 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:16.868 13:17:35 -- setup/hugepages.sh@89 -- # local node 00:05:16.868 13:17:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:16.868 13:17:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:16.868 13:17:35 -- setup/hugepages.sh@92 -- # local surp 00:05:16.868 13:17:35 -- setup/hugepages.sh@93 -- # local resv 00:05:16.868 13:17:35 -- setup/hugepages.sh@94 -- # local anon 00:05:16.868 13:17:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:16.868 13:17:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:16.868 13:17:35 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:16.868 13:17:35 -- setup/common.sh@18 -- # local node= 00:05:16.868 13:17:35 -- setup/common.sh@19 -- # local var val 00:05:16.868 13:17:35 -- setup/common.sh@20 -- # local mem_f mem 00:05:16.868 13:17:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.868 13:17:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:16.868 13:17:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:16.868 13:17:35 -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.868 13:17:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68520948 kB' 'MemAvailable: 72500724 kB' 
'Buffers: 9896 kB' 'Cached: 18227464 kB' 'SwapCached: 0 kB' 'Active: 15038504 kB' 'Inactive: 3731776 kB' 'Active(anon): 14480376 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535852 kB' 'Mapped: 202628 kB' 'Shmem: 13947456 kB' 'KReclaimable: 526056 kB' 'Slab: 949108 kB' 'SReclaimable: 526056 kB' 'SUnreclaim: 423052 kB' 'KernelStack: 16416 kB' 'PageTables: 8744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15852296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214280 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 
00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # 
[[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 
-- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.868 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.868 13:17:35 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 
00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:16.869 13:17:35 -- setup/common.sh@33 -- # echo 0 00:05:16.869 13:17:35 -- setup/common.sh@33 -- # return 0 00:05:16.869 13:17:35 -- setup/hugepages.sh@97 -- # anon=0 00:05:16.869 13:17:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:16.869 13:17:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:16.869 13:17:35 -- setup/common.sh@18 -- # local node= 00:05:16.869 13:17:35 -- setup/common.sh@19 -- # local var val 00:05:16.869 13:17:35 -- setup/common.sh@20 -- # local mem_f mem 00:05:16.869 13:17:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.869 13:17:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:16.869 13:17:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:16.869 13:17:35 -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.869 13:17:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68520728 kB' 'MemAvailable: 72500504 kB' 'Buffers: 9896 kB' 'Cached: 18227464 kB' 'SwapCached: 0 kB' 'Active: 15038640 kB' 'Inactive: 3731776 kB' 'Active(anon): 14480512 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 
'AnonPages: 535924 kB' 'Mapped: 202628 kB' 'Shmem: 13947456 kB' 'KReclaimable: 526056 kB' 'Slab: 949096 kB' 'SReclaimable: 526056 kB' 'SUnreclaim: 423040 kB' 'KernelStack: 16384 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15852308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 
-- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 
13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.869 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.869 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 
00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- 
# IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 
00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.870 13:17:35 -- setup/common.sh@33 -- # echo 0 00:05:16.870 13:17:35 -- setup/common.sh@33 -- # return 0 00:05:16.870 13:17:35 -- setup/hugepages.sh@99 -- # surp=0 00:05:16.870 13:17:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:16.870 13:17:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:16.870 13:17:35 -- setup/common.sh@18 -- # local node= 00:05:16.870 13:17:35 -- setup/common.sh@19 -- # local var val 00:05:16.870 13:17:35 -- setup/common.sh@20 -- # local mem_f mem 00:05:16.870 13:17:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.870 13:17:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:16.870 
13:17:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:16.870 13:17:35 -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.870 13:17:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.870 13:17:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68520476 kB' 'MemAvailable: 72500252 kB' 'Buffers: 9896 kB' 'Cached: 18227480 kB' 'SwapCached: 0 kB' 'Active: 15038036 kB' 'Inactive: 3731776 kB' 'Active(anon): 14479908 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535772 kB' 'Mapped: 202504 kB' 'Shmem: 13947472 kB' 'KReclaimable: 526056 kB' 'Slab: 949124 kB' 'SReclaimable: 526056 kB' 'SUnreclaim: 423068 kB' 'KernelStack: 16384 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15852320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.870 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.870 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- 
setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 
-- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # 
IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 
13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.871 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.871 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:16.872 13:17:35 -- setup/common.sh@33 -- # echo 0 00:05:16.872 13:17:35 -- setup/common.sh@33 -- # return 0 00:05:16.872 13:17:35 -- setup/hugepages.sh@100 
-- # resv=0 00:05:16.872 13:17:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:16.872 nr_hugepages=1024 00:05:16.872 13:17:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:16.872 resv_hugepages=0 00:05:16.872 13:17:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:16.872 surplus_hugepages=0 00:05:16.872 13:17:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:16.872 anon_hugepages=0 00:05:16.872 13:17:35 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:16.872 13:17:35 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:16.872 13:17:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:16.872 13:17:35 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:16.872 13:17:35 -- setup/common.sh@18 -- # local node= 00:05:16.872 13:17:35 -- setup/common.sh@19 -- # local var val 00:05:16.872 13:17:35 -- setup/common.sh@20 -- # local mem_f mem 00:05:16.872 13:17:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.872 13:17:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:16.872 13:17:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:16.872 13:17:35 -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.872 13:17:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68520476 kB' 'MemAvailable: 72500252 kB' 'Buffers: 9896 kB' 'Cached: 18227496 kB' 'SwapCached: 0 kB' 'Active: 15038092 kB' 'Inactive: 3731776 kB' 'Active(anon): 14479964 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535296 kB' 'Mapped: 202504 kB' 
'Shmem: 13947488 kB' 'KReclaimable: 526056 kB' 'Slab: 949124 kB' 'SReclaimable: 526056 kB' 'SUnreclaim: 423068 kB' 'KernelStack: 16368 kB' 'PageTables: 8580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15863580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214264 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Cached == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.872 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.872 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- 
setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- 
setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- 
setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- 
setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.873 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.873 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:16.873 13:17:35 -- setup/common.sh@33 -- # echo 1024 00:05:16.873 13:17:35 -- setup/common.sh@33 -- # return 0 00:05:16.873 13:17:35 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:16.873 13:17:35 -- setup/hugepages.sh@112 -- # get_nodes 00:05:16.873 13:17:35 -- setup/hugepages.sh@27 -- # local node 00:05:16.873 13:17:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:16.873 13:17:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:16.873 13:17:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:16.873 13:17:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:16.873 13:17:35 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:16.873 13:17:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:16.873 13:17:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:16.873 13:17:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:16.873 13:17:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:16.873 13:17:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:16.874 13:17:35 -- setup/common.sh@18 -- # local node=0 00:05:16.874 13:17:35 -- setup/common.sh@19 -- # local var val 00:05:16.874 13:17:35 -- setup/common.sh@20 -- # local mem_f mem 00:05:16.874 13:17:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:16.874 13:17:35 -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:16.874 13:17:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:16.874 13:17:35 -- setup/common.sh@28 -- # mapfile -t mem 00:05:16.874 13:17:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.874 13:17:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116968 kB' 'MemFree: 39234844 kB' 'MemUsed: 8882124 kB' 'SwapCached: 0 kB' 'Active: 4550120 kB' 'Inactive: 285836 kB' 'Active(anon): 4132468 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528796 kB' 'Mapped: 113292 kB' 'AnonPages: 310356 kB' 'Shmem: 3825308 kB' 'KernelStack: 8744 kB' 'PageTables: 5596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361060 kB' 'Slab: 600784 kB' 'SReclaimable: 361060 kB' 'SUnreclaim: 239724 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:16.874 13:17:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.874 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.874 13:17:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.874 13:17:35 -- setup/common.sh@32 -- # continue 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # read -r var val _ 00:05:16.874 13:17:35 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:16.874 13:17:35 -- 
setup/common.sh@32 -- # continue 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # IFS=': ' 00:05:16.874 13:17:35 -- setup/common.sh@31 -- # read -r var val _ [... xtrace repeats the same IFS=': ' / read / continue triple for every remaining /proc/meminfo field (SwapCached, Active, Inactive, ..., HugePages_Total, HugePages_Free) until the requested key matches ...] 00:05:17.136 13:17:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.136 13:17:35 -- setup/common.sh@33 -- # echo 0 00:05:17.136 13:17:35 -- setup/common.sh@33 -- # return 0 00:05:17.136 13:17:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:17.136 13:17:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:17.136 13:17:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:17.136 13:17:35 -- setup/hugepages.sh@127 -- #
sorted_s[nodes_sys[node]]=1 00:05:17.136 13:17:35 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:17.136 node0=1024 expecting 1024 00:05:17.136 13:17:35 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:17.136 13:17:35 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:17.136 13:17:35 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:17.136 13:17:35 -- setup/hugepages.sh@202 -- # setup output 00:05:17.136 13:17:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.136 13:17:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:21.331 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:21.331 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:21.331 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:23.238 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:23.238 13:17:41 -- setup/hugepages.sh@204 -- # 
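The xtrace above shows setup/common.sh scanning /proc/meminfo one line at a time with `IFS=': ' read -r var val _`, `continue`-ing past every field until the requested key matches, then echoing its value. The following is a minimal, hedged sketch of that pattern (the function name and file parameter are illustrative assumptions, not SPDK's actual `get_meminfo` implementation):

```shell
#!/usr/bin/env bash
# Sketch of the pattern in the trace: walk a meminfo-style file, splitting on
# ':' and whitespace, and print the value of the first matching key.
get_meminfo_field() {
    local get=$1 mem_f=${2:-/proc/meminfo}   # key to look up, file to scan
    local var val rest
    while IFS=': ' read -r var val rest; do
        [[ $var == "$get" ]] || continue     # skip non-matching fields
        echo "$val"
        return 0
    done < "$mem_f"
    echo 0   # key absent: report 0, mirroring the script's fallback of echo 0
}
```

For example, `get_meminfo_field HugePages_Total` would print the system-wide hugepage count; setting `IFS=': '` makes `read` treat both the colon and the surrounding spaces as field separators, which is why `var` holds the bare key name in the trace.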
verify_nr_hugepages 00:05:23.238 13:17:41 -- setup/hugepages.sh@89 -- # local node 00:05:23.238 13:17:41 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:23.238 13:17:41 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:23.238 13:17:41 -- setup/hugepages.sh@92 -- # local surp 00:05:23.238 13:17:41 -- setup/hugepages.sh@93 -- # local resv 00:05:23.238 13:17:41 -- setup/hugepages.sh@94 -- # local anon 00:05:23.238 13:17:41 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:23.238 13:17:41 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:23.238 13:17:41 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:23.238 13:17:41 -- setup/common.sh@18 -- # local node= 00:05:23.238 13:17:41 -- setup/common.sh@19 -- # local var val 00:05:23.238 13:17:41 -- setup/common.sh@20 -- # local mem_f mem 00:05:23.238 13:17:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.238 13:17:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.238 13:17:41 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.238 13:17:41 -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.238 13:17:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.238 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.238 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.238 13:17:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68509280 kB' 'MemAvailable: 72489040 kB' 'Buffers: 9896 kB' 'Cached: 18227620 kB' 'SwapCached: 0 kB' 'Active: 15039904 kB' 'Inactive: 3731776 kB' 'Active(anon): 14481776 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537648 kB' 'Mapped: 202616 kB' 'Shmem: 13947612 kB' 'KReclaimable: 526040 kB' 'Slab: 949240 kB' 'SReclaimable: 526040 kB' 'SUnreclaim: 423200 kB' 
'KernelStack: 16496 kB' 'PageTables: 8948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15853096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214376 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:23.238 13:17:41 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.238 13:17:41 -- setup/common.sh@32 -- # continue [... xtrace repeats the same IFS=': ' / read / continue triple for each subsequent /proc/meminfo field until the requested key matches ...] 00:05:23.239 13:17:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:23.239 13:17:41 --
setup/common.sh@33 -- # echo 0 00:05:23.239 13:17:41 -- setup/common.sh@33 -- # return 0 00:05:23.239 13:17:41 -- setup/hugepages.sh@97 -- # anon=0 00:05:23.239 13:17:41 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:23.239 13:17:41 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:23.239 13:17:41 -- setup/common.sh@18 -- # local node= 00:05:23.239 13:17:41 -- setup/common.sh@19 -- # local var val 00:05:23.239 13:17:41 -- setup/common.sh@20 -- # local mem_f mem 00:05:23.239 13:17:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.239 13:17:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.239 13:17:41 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.239 13:17:41 -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.239 13:17:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.239 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.239 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.239 13:17:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68513124 kB' 'MemAvailable: 72492884 kB' 'Buffers: 9896 kB' 'Cached: 18227620 kB' 'SwapCached: 0 kB' 'Active: 15039500 kB' 'Inactive: 3731776 kB' 'Active(anon): 14481372 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537128 kB' 'Mapped: 202568 kB' 'Shmem: 13947612 kB' 'KReclaimable: 526040 kB' 'Slab: 949232 kB' 'SReclaimable: 526040 kB' 'SUnreclaim: 423192 kB' 'KernelStack: 16464 kB' 'PageTables: 8836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15853108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214312 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:23.239 13:17:41 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.239 13:17:41 -- setup/common.sh@32 -- # continue [... xtrace repeats the same IFS=': ' / read / continue triple for each subsequent /proc/meminfo field ...] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 --
# continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.241 13:17:41 -- setup/common.sh@33 -- # echo 0 00:05:23.241 13:17:41 -- setup/common.sh@33 -- # return 0 00:05:23.241 13:17:41 -- setup/hugepages.sh@99 -- # surp=0 00:05:23.241 13:17:41 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:23.241 13:17:41 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:23.241 13:17:41 -- setup/common.sh@18 -- # local node= 00:05:23.241 13:17:41 -- setup/common.sh@19 -- # local var val 00:05:23.241 13:17:41 -- setup/common.sh@20 -- # local mem_f mem 00:05:23.241 13:17:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.241 13:17:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.241 13:17:41 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.241 13:17:41 -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.241 13:17:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68512908 kB' 'MemAvailable: 72492668 kB' 'Buffers: 9896 kB' 'Cached: 18227636 kB' 
'SwapCached: 0 kB' 'Active: 15039468 kB' 'Inactive: 3731776 kB' 'Active(anon): 14481340 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537104 kB' 'Mapped: 202568 kB' 'Shmem: 13947628 kB' 'KReclaimable: 526040 kB' 'Slab: 949248 kB' 'SReclaimable: 526040 kB' 'SUnreclaim: 423208 kB' 'KernelStack: 16464 kB' 'PageTables: 8856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15853120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214328 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 
00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- 
setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Zswapped 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 
13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.241 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.241 13:17:41 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- 
setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # 
continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 
00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:23.242 13:17:41 -- setup/common.sh@33 -- # echo 0 00:05:23.242 13:17:41 -- setup/common.sh@33 -- # return 0 00:05:23.242 13:17:41 -- setup/hugepages.sh@100 -- # resv=0 00:05:23.242 13:17:41 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:23.242 nr_hugepages=1024 00:05:23.242 13:17:41 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:23.242 resv_hugepages=0 00:05:23.242 13:17:41 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:23.242 surplus_hugepages=0 00:05:23.242 13:17:41 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:23.242 anon_hugepages=0 00:05:23.242 13:17:41 -- 
setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:23.242 13:17:41 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:23.242 13:17:41 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:23.242 13:17:41 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:23.242 13:17:41 -- setup/common.sh@18 -- # local node= 00:05:23.242 13:17:41 -- setup/common.sh@19 -- # local var val 00:05:23.242 13:17:41 -- setup/common.sh@20 -- # local mem_f mem 00:05:23.242 13:17:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.242 13:17:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:23.242 13:17:41 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:23.242 13:17:41 -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.242 13:17:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 68512908 kB' 'MemAvailable: 72492668 kB' 'Buffers: 9896 kB' 'Cached: 18227636 kB' 'SwapCached: 0 kB' 'Active: 15039504 kB' 'Inactive: 3731776 kB' 'Active(anon): 14481376 kB' 'Inactive(anon): 0 kB' 'Active(file): 558128 kB' 'Inactive(file): 3731776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537140 kB' 'Mapped: 202568 kB' 'Shmem: 13947628 kB' 'KReclaimable: 526040 kB' 'Slab: 949248 kB' 'SReclaimable: 526040 kB' 'SUnreclaim: 423208 kB' 'KernelStack: 16480 kB' 'PageTables: 8908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 15853136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214344 kB' 'VmallocChunk: 0 kB' 'Percpu: 76800 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1342912 kB' 'DirectMap2M: 30838784 kB' 'DirectMap1G: 69206016 kB' 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.242 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.242 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- 
# read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r 
var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- 
setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # [[ Committed_AS 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.243 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.243 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Total == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:23.244 13:17:41 -- setup/common.sh@33 -- # echo 1024 00:05:23.244 13:17:41 -- setup/common.sh@33 -- # return 0 00:05:23.244 13:17:41 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:23.244 13:17:41 -- setup/hugepages.sh@112 -- # get_nodes 00:05:23.244 13:17:41 -- setup/hugepages.sh@27 -- # local node 00:05:23.244 13:17:41 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:23.244 13:17:41 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:23.244 13:17:41 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:23.244 13:17:41 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:23.244 13:17:41 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:23.244 13:17:41 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:23.244 13:17:41 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:23.244 13:17:41 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:23.244 13:17:41 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:23.244 13:17:41 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:23.244 13:17:41 -- setup/common.sh@18 -- # local node=0 00:05:23.244 13:17:41 -- setup/common.sh@19 -- # local var val 00:05:23.244 13:17:41 -- setup/common.sh@20 -- # local mem_f mem 00:05:23.244 13:17:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:23.244 13:17:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:23.244 13:17:41 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:23.244 13:17:41 -- setup/common.sh@28 -- # mapfile -t mem 00:05:23.244 13:17:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
48116968 kB' 'MemFree: 39229360 kB' 'MemUsed: 8887608 kB' 'SwapCached: 0 kB' 'Active: 4549684 kB' 'Inactive: 285836 kB' 'Active(anon): 4132032 kB' 'Inactive(anon): 0 kB' 'Active(file): 417652 kB' 'Inactive(file): 285836 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4528876 kB' 'Mapped: 113292 kB' 'AnonPages: 309836 kB' 'Shmem: 3825388 kB' 'KernelStack: 8696 kB' 'PageTables: 5456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 361012 kB' 'Slab: 600440 kB' 'SReclaimable: 361012 kB' 'SUnreclaim: 239428 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.244 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.244 13:17:41 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 
13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- 
# continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 
00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # continue 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # IFS=': ' 00:05:23.245 13:17:41 -- setup/common.sh@31 -- # read -r var val _ 00:05:23.245 13:17:41 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:23.245 13:17:41 -- setup/common.sh@33 -- # echo 0 00:05:23.245 13:17:41 -- setup/common.sh@33 -- # return 0 00:05:23.245 13:17:41 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:23.245 13:17:41 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:23.245 13:17:41 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:23.245 13:17:41 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:23.245 13:17:41 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:23.245 node0=1024 expecting 1024 00:05:23.245 13:17:41 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:23.245 00:05:23.245 real 0m12.253s 00:05:23.245 user 0m4.309s 00:05:23.245 sys 0m8.034s 00:05:23.245 13:17:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:23.245 13:17:41 -- common/autotest_common.sh@10 -- # set +x 00:05:23.245 
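The long xtrace run above is setup/common.sh's `get_meminfo` walking a meminfo file line by line with `IFS=': '` until it reaches the requested key, then echoing that key's value. A minimal standalone sketch of that pattern (the sample input below is illustrative data, not output captured from this test host):

```shell
# Minimal sketch of the get_meminfo pattern traced above: split each
# "Key: value kB" line on ': ' and print the value for the requested key.
# The sample below is illustrative, not real output from this host.
get_meminfo() {
    get=$1
    while IFS=': ' read -r var val _; do
        if [ "$var" = "$get" ]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

sample='MemTotal: 48116968 kB
HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Surp: 0'

printf '%s\n' "$sample" | get_meminfo HugePages_Total   # prints 1024
```

In the real script the input is `/proc/meminfo` or `/sys/devices/system/node/node$node/meminfo`, which is why every non-matching key produces one `continue` entry in the trace before the `echo 1024` / `return 0` pair.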
************************************ 00:05:23.245 END TEST no_shrink_alloc 00:05:23.245 ************************************ 00:05:23.245 13:17:41 -- setup/hugepages.sh@217 -- # clear_hp 00:05:23.245 13:17:41 -- setup/hugepages.sh@37 -- # local node hp 00:05:23.245 13:17:41 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:23.245 13:17:41 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:23.245 13:17:41 -- setup/hugepages.sh@41 -- # echo 0 00:05:23.245 13:17:41 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:23.245 13:17:41 -- setup/hugepages.sh@41 -- # echo 0 00:05:23.245 13:17:41 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:23.245 13:17:41 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:23.245 13:17:41 -- setup/hugepages.sh@41 -- # echo 0 00:05:23.245 13:17:41 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:23.245 13:17:41 -- setup/hugepages.sh@41 -- # echo 0 00:05:23.245 13:17:41 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:23.245 13:17:41 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:23.245 00:05:23.245 real 0m45.905s 00:05:23.245 user 0m14.786s 00:05:23.245 sys 0m28.461s 00:05:23.245 13:17:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:23.245 13:17:41 -- common/autotest_common.sh@10 -- # set +x 00:05:23.245 ************************************ 00:05:23.245 END TEST hugepages 00:05:23.245 ************************************ 00:05:23.245 13:17:41 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:23.245 13:17:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:23.245 13:17:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:23.245 13:17:41 -- common/autotest_common.sh@10 
-- # set +x 00:05:23.245 ************************************ 00:05:23.245 START TEST driver 00:05:23.245 ************************************ 00:05:23.245 13:17:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:23.245 * Looking for test storage... 00:05:23.245 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:23.245 13:17:42 -- setup/driver.sh@68 -- # setup reset 00:05:23.245 13:17:42 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:23.245 13:17:42 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:31.428 13:17:49 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:31.428 13:17:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:31.428 13:17:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:31.428 13:17:49 -- common/autotest_common.sh@10 -- # set +x 00:05:31.428 ************************************ 00:05:31.428 START TEST guess_driver 00:05:31.428 ************************************ 00:05:31.428 13:17:49 -- common/autotest_common.sh@1104 -- # guess_driver 00:05:31.428 13:17:49 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:31.428 13:17:49 -- setup/driver.sh@47 -- # local fail=0 00:05:31.428 13:17:49 -- setup/driver.sh@49 -- # pick_driver 00:05:31.428 13:17:49 -- setup/driver.sh@36 -- # vfio 00:05:31.428 13:17:49 -- setup/driver.sh@21 -- # local iommu_grups 00:05:31.428 13:17:49 -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:31.428 13:17:49 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:31.428 13:17:49 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:31.428 13:17:49 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:31.428 13:17:49 -- setup/driver.sh@29 -- # (( 238 > 0 )) 00:05:31.428 13:17:49 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:31.428 
13:17:49 -- setup/driver.sh@14 -- # mod vfio_pci 00:05:31.428 13:17:49 -- setup/driver.sh@12 -- # dep vfio_pci 00:05:31.428 13:17:49 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:31.428 13:17:49 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:31.428 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:31.428 13:17:49 -- setup/driver.sh@30 -- # return 0 00:05:31.428 13:17:49 -- setup/driver.sh@37 -- # echo vfio-pci 00:05:31.428 13:17:49 -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:31.428 13:17:49 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:31.428 13:17:49 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:31.428 Looking for driver=vfio-pci 00:05:31.428 13:17:49 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:31.428 13:17:49 -- setup/driver.sh@45 -- # setup output config 00:05:31.428 13:17:49 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:31.428 13:17:49 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 
13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read 
-r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.717 13:17:53 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:34.717 13:17:53 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:34.717 13:17:53 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:38.006 13:17:56 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:38.006 13:17:56 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:38.006 13:17:56 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.913 13:17:58 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:39.913 13:17:58 -- setup/driver.sh@65 -- # setup reset 00:05:39.913 13:17:58 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:39.913 13:17:58 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:48.083 00:05:48.083 real 0m16.267s 00:05:48.083 user 
0m3.954s 00:05:48.083 sys 0m8.392s 00:05:48.083 13:18:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.083 13:18:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.083 ************************************ 00:05:48.083 END TEST guess_driver 00:05:48.083 ************************************ 00:05:48.083 00:05:48.083 real 0m23.902s 00:05:48.083 user 0m6.215s 00:05:48.083 sys 0m12.951s 00:05:48.083 13:18:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.083 13:18:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.083 ************************************ 00:05:48.083 END TEST driver 00:05:48.083 ************************************ 00:05:48.083 13:18:05 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:48.083 13:18:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.083 13:18:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.083 13:18:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.083 ************************************ 00:05:48.083 START TEST devices 00:05:48.083 ************************************ 00:05:48.083 13:18:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:48.083 * Looking for test storage... 
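The guess_driver test that finishes above settles on vfio-pci because `modprobe --show-depends vfio_pci` emitted insmod lines referencing `.ko` files (the `== *\.\k\o*` pattern match in the trace). A sketch of that acceptance check, run against a captured dependency string rather than a live `modprobe` so it works on any host:

```shell
# Sketch of the is_driver check from setup/driver.sh traced above: a module
# counts as available when `modprobe --show-depends` output mentions a .ko
# file. The deps string is fixed here so the sketch runs without modprobe.
is_driver_output() {
    case $1 in
        *.ko*) return 0 ;;
        *)     return 1 ;;
    esac
}

deps='insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz'
if is_driver_output "$deps"; then
    echo vfio-pci    # prints vfio-pci, the driver the trace settles on
fi
```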
00:05:48.083 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:48.083 13:18:05 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:48.083 13:18:05 -- setup/devices.sh@192 -- # setup reset 00:05:48.083 13:18:05 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:48.083 13:18:05 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:54.658 13:18:12 -- setup/devices.sh@194 -- # get_zoned_devs 00:05:54.658 13:18:12 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:05:54.659 13:18:12 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:05:54.659 13:18:12 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:05:54.659 13:18:12 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:05:54.659 13:18:12 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:05:54.659 13:18:12 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:05:54.659 13:18:12 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:54.659 13:18:12 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:05:54.659 13:18:12 -- setup/devices.sh@196 -- # blocks=() 00:05:54.659 13:18:12 -- setup/devices.sh@196 -- # declare -a blocks 00:05:54.659 13:18:12 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:54.659 13:18:12 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:54.659 13:18:12 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:54.659 13:18:12 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:54.659 13:18:12 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:54.659 13:18:12 -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:54.659 13:18:12 -- setup/devices.sh@202 -- # pci=0000:1a:00.0 00:05:54.659 13:18:12 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:05:54.659 13:18:12 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:54.659 13:18:12 -- 
scripts/common.sh@380 -- # local block=nvme0n1 pt 00:05:54.659 13:18:12 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:54.659 No valid GPT data, bailing 00:05:54.659 13:18:12 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:54.659 13:18:12 -- scripts/common.sh@393 -- # pt= 00:05:54.659 13:18:12 -- scripts/common.sh@394 -- # return 1 00:05:54.659 13:18:12 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:54.659 13:18:12 -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:54.659 13:18:12 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:54.659 13:18:12 -- setup/common.sh@80 -- # echo 4000787030016 00:05:54.659 13:18:12 -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:05:54.659 13:18:12 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:54.659 13:18:12 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:1a:00.0 00:05:54.659 13:18:12 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:54.659 13:18:12 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:54.659 13:18:12 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:54.659 13:18:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:54.659 13:18:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:54.659 13:18:12 -- common/autotest_common.sh@10 -- # set +x 00:05:54.659 ************************************ 00:05:54.659 START TEST nvme_mount 00:05:54.659 ************************************ 00:05:54.659 13:18:12 -- common/autotest_common.sh@1104 -- # nvme_mount 00:05:54.659 13:18:12 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:54.659 13:18:12 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:54.659 13:18:12 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.659 13:18:12 -- setup/devices.sh@98 -- # 
nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:54.659 13:18:12 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:54.659 13:18:12 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:54.659 13:18:12 -- setup/common.sh@40 -- # local part_no=1 00:05:54.659 13:18:12 -- setup/common.sh@41 -- # local size=1073741824 00:05:54.659 13:18:12 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:54.659 13:18:12 -- setup/common.sh@44 -- # parts=() 00:05:54.659 13:18:12 -- setup/common.sh@44 -- # local parts 00:05:54.659 13:18:12 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:54.659 13:18:12 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:54.659 13:18:12 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:54.659 13:18:12 -- setup/common.sh@46 -- # (( part++ )) 00:05:54.659 13:18:12 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:54.659 13:18:12 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:54.659 13:18:12 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:54.659 13:18:12 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:54.918 Creating new GPT entries in memory. 00:05:54.918 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:54.918 other utilities. 00:05:54.918 13:18:13 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:54.918 13:18:13 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:54.918 13:18:13 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:54.918 13:18:13 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:54.918 13:18:13 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:55.855 Creating new GPT entries in memory. 00:05:55.855 The operation has completed successfully. 
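The device scan earlier in this trace (sec_size_to_bytes, min_disk_size) only admits a disk whose capacity clears a 3 GiB floor; the 4000787030016-byte NVMe drive here passes. A minimal bash sketch of that gate, with helper names assumed for illustration (the real helpers live in setup/common.sh and read /sys/block/<dev>/size):

```shell
# Hypothetical sketch of the disk-size gate seen in the trace above.
# /sys/block/<dev>/size reports capacity in 512-byte sectors.
min_disk_size=3221225472   # 3 GiB, as in setup/devices.sh@198

sectors_to_bytes() {
    echo $(( $1 * 512 ))
}

disk_big_enough() {
    local bytes
    bytes=$(sectors_to_bytes "$1")
    (( bytes >= min_disk_size ))
}
```

The drive in this run reports 4000787030016 bytes (7814037168 sectors), comfortably above the floor, so it becomes the test_disk.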
00:05:55.855 13:18:14 -- setup/common.sh@57 -- # (( part++ )) 00:05:55.855 13:18:14 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:55.855 13:18:14 -- setup/common.sh@62 -- # wait 3129227 00:05:55.855 13:18:14 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.855 13:18:14 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:55.855 13:18:14 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.855 13:18:14 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:55.855 13:18:14 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:55.855 13:18:14 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.855 13:18:14 -- setup/devices.sh@105 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:55.855 13:18:14 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:05:55.855 13:18:14 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:55.855 13:18:14 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.855 13:18:14 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:55.855 13:18:14 -- setup/devices.sh@53 -- # local found=0 00:05:55.855 13:18:14 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:55.855 13:18:14 -- setup/devices.sh@56 -- # : 00:05:55.855 13:18:14 -- setup/devices.sh@59 -- # local pci status 00:05:55.855 13:18:14 -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:05:55.855 13:18:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:05:55.855 13:18:14 -- setup/devices.sh@47 -- # setup output config 00:05:55.855 13:18:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:55.855 13:18:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:00.046 13:18:18 -- setup/devices.sh@63 -- # found=1 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.046 13:18:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:00.046 13:18:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.949 13:18:20 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:01.949 13:18:20 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:01.949 13:18:20 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.949 13:18:20 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:01.949 
13:18:20 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:01.949 13:18:20 -- setup/devices.sh@110 -- # cleanup_nvme 00:06:01.949 13:18:20 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.949 13:18:20 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.949 13:18:20 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:01.949 13:18:20 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:01.949 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:01.949 13:18:20 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:01.949 13:18:20 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:02.208 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:02.208 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:06:02.208 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:02.208 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:02.208 13:18:20 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:02.208 13:18:20 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:02.208 13:18:20 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.208 13:18:20 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:02.208 13:18:20 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:02.208 13:18:20 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.208 13:18:20 -- setup/devices.sh@116 -- # verify 
0000:1a:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:02.208 13:18:20 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:02.209 13:18:20 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:02.209 13:18:20 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.209 13:18:20 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:02.209 13:18:20 -- setup/devices.sh@53 -- # local found=0 00:06:02.209 13:18:20 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:02.209 13:18:20 -- setup/devices.sh@56 -- # : 00:06:02.209 13:18:20 -- setup/devices.sh@59 -- # local pci status 00:06:02.209 13:18:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.209 13:18:20 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:02.209 13:18:20 -- setup/devices.sh@47 -- # setup output config 00:06:02.209 13:18:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:02.209 13:18:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:06.398 13:18:24 -- setup/devices.sh@63 -- # found=1 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.398 13:18:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:06.398 13:18:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.840 13:18:26 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:07.840 13:18:26 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:07.840 13:18:26 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.840 13:18:26 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:07.840 13:18:26 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:07.840 13:18:26 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:07.840 13:18:26 -- setup/devices.sh@125 -- # verify 0000:1a:00.0 data@nvme0n1 '' '' 00:06:07.840 13:18:26 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:07.840 13:18:26 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:07.840 13:18:26 -- setup/devices.sh@50 -- # local mount_point= 00:06:07.840 13:18:26 -- setup/devices.sh@51 -- # local test_file= 00:06:07.840 13:18:26 -- setup/devices.sh@53 -- # local found=0 00:06:07.840 13:18:26 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:07.840 13:18:26 -- setup/devices.sh@59 -- # local pci status 00:06:07.840 13:18:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.840 13:18:26 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 
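The long runs of `[[ 0000:xx:04.x == \0\0\0\0\:\1\a\:\0\0\.\0 ]]` checks above are an allowlist scan: verify() reads `pci _ _ status` lines emitted by `setup.sh config` and flags the target BDF once its status line names the active mount. A compact sketch of that pattern, with the function name and line format assumed from the trace rather than taken from setup/devices.sh:

```shell
# Hedged sketch of the verify() scan pattern: walk "pci _ _ status" lines
# on stdin and report 1 when the allowed device's status mentions the
# active mount, 0 otherwise.
scan_for_active() {
    local dev=$1 want=$2 found=0
    local pci status
    while read -r pci _ _ status; do
        if [[ $pci == "$dev" && $status == *"$want"* ]]; then
            found=1
        fi
    done
    echo "$found"
}
```

Feeding it config-style lines on stdin reproduces the found=1 result seen above for the 0000:1a:00.0 device.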
00:06:07.840 13:18:26 -- setup/devices.sh@47 -- # setup output config 00:06:07.840 13:18:26 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:07.840 13:18:26 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:12.033 13:18:30 -- setup/devices.sh@63 -- # found=1 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.033 13:18:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:12.033 13:18:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.946 13:18:32 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:13.946 13:18:32 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:13.946 13:18:32 -- setup/devices.sh@68 -- # return 0 00:06:13.946 13:18:32 -- setup/devices.sh@128 -- # cleanup_nvme 00:06:13.946 13:18:32 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.946 13:18:32 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:13.946 13:18:32 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:13.946 13:18:32 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 
00:06:13.946 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:13.946 00:06:13.946 real 0m20.158s 00:06:13.946 user 0m5.853s 00:06:13.946 sys 0m12.160s 00:06:13.946 13:18:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:13.946 13:18:32 -- common/autotest_common.sh@10 -- # set +x 00:06:13.946 ************************************ 00:06:13.946 END TEST nvme_mount 00:06:13.946 ************************************ 00:06:13.946 13:18:32 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:13.946 13:18:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:13.946 13:18:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:13.946 13:18:32 -- common/autotest_common.sh@10 -- # set +x 00:06:13.946 ************************************ 00:06:13.946 START TEST dm_mount 00:06:13.946 ************************************ 00:06:13.946 13:18:32 -- common/autotest_common.sh@1104 -- # dm_mount 00:06:13.946 13:18:32 -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:13.946 13:18:32 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:13.946 13:18:32 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:13.946 13:18:32 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:13.946 13:18:32 -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:13.946 13:18:32 -- setup/common.sh@40 -- # local part_no=2 00:06:13.946 13:18:32 -- setup/common.sh@41 -- # local size=1073741824 00:06:13.946 13:18:32 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:13.946 13:18:32 -- setup/common.sh@44 -- # parts=() 00:06:13.946 13:18:32 -- setup/common.sh@44 -- # local parts 00:06:13.946 13:18:32 -- setup/common.sh@46 -- # (( part = 1 )) 00:06:13.946 13:18:32 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:13.946 13:18:32 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:13.946 13:18:32 -- setup/common.sh@46 -- # (( part++ )) 00:06:13.946 13:18:32 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:13.946 13:18:32 -- 
setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:13.946 13:18:32 -- setup/common.sh@46 -- # (( part++ )) 00:06:13.946 13:18:32 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:13.946 13:18:32 -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:13.946 13:18:32 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:13.946 13:18:32 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:14.926 Creating new GPT entries in memory. 00:06:14.927 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:14.927 other utilities. 00:06:14.927 13:18:33 -- setup/common.sh@57 -- # (( part = 1 )) 00:06:14.927 13:18:33 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:14.927 13:18:33 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:14.927 13:18:33 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:14.927 13:18:33 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:16.306 Creating new GPT entries in memory. 00:06:16.306 The operation has completed successfully. 00:06:16.306 13:18:34 -- setup/common.sh@57 -- # (( part++ )) 00:06:16.306 13:18:34 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:16.306 13:18:34 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:16.306 13:18:34 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:16.306 13:18:34 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:17.243 The operation has completed successfully. 
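Both sgdisk runs above (--new=1:2048:2099199, then --new=2:2099200:4196351) come from the same offset arithmetic in setup/common.sh: convert the 1 GiB partition size to 512-byte sectors, start the first partition at sector 2048, and butt each later partition against the previous end. A self-contained sketch of that loop math:

```shell
# Partition-offset arithmetic mirrored from the trace (setup/common.sh@58-59).
size=1073741824        # 1 GiB per partition, in bytes
(( size /= 512 ))      # -> 2097152 sectors
part_start=0
part_end=0

next_partition() {
    # First partition starts at sector 2048; later ones follow the last end.
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
}
```

Two successive calls reproduce exactly the start:end sector ranges logged above.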
00:06:17.243 13:18:35 -- setup/common.sh@57 -- # (( part++ )) 00:06:17.243 13:18:35 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:17.243 13:18:35 -- setup/common.sh@62 -- # wait 3134684 00:06:17.243 13:18:35 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:17.243 13:18:35 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.243 13:18:35 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:17.243 13:18:35 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:17.243 13:18:35 -- setup/devices.sh@160 -- # for t in {1..5} 00:06:17.243 13:18:35 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:17.243 13:18:35 -- setup/devices.sh@161 -- # break 00:06:17.243 13:18:35 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:17.243 13:18:35 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:17.243 13:18:35 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:17.243 13:18:35 -- setup/devices.sh@166 -- # dm=dm-0 00:06:17.243 13:18:35 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:17.243 13:18:35 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:17.243 13:18:35 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.243 13:18:35 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:17.243 13:18:35 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.243 13:18:35 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:17.243 13:18:35 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:17.243 13:18:35 -- setup/common.sh@72 
-- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.243 13:18:35 -- setup/devices.sh@174 -- # verify 0000:1a:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:17.243 13:18:35 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:17.243 13:18:35 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:17.243 13:18:35 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:17.243 13:18:35 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:17.243 13:18:35 -- setup/devices.sh@53 -- # local found=0 00:06:17.243 13:18:35 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:17.243 13:18:35 -- setup/devices.sh@56 -- # : 00:06:17.243 13:18:35 -- setup/devices.sh@59 -- # local pci status 00:06:17.243 13:18:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.243 13:18:35 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:17.243 13:18:35 -- setup/devices.sh@47 -- # setup output config 00:06:17.243 13:18:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:17.243 13:18:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:21.435 13:18:39 -- setup/devices.sh@63 -- # found=1 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:21.435 13:18:39 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:21.435 13:18:39 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.341 13:18:41 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:23.341 13:18:41 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:23.341 13:18:41 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:23.341 13:18:41 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:23.341 13:18:41 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:23.341 13:18:41 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:23.341 13:18:41 -- setup/devices.sh@184 -- # verify 0000:1a:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:23.341 13:18:41 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:23.341 13:18:41 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:23.341 13:18:41 -- setup/devices.sh@50 -- # local mount_point= 00:06:23.341 13:18:41 -- setup/devices.sh@51 -- # local test_file= 00:06:23.341 13:18:41 -- setup/devices.sh@53 -- # local found=0 00:06:23.341 13:18:41 -- 
setup/devices.sh@55 -- # [[ -n '' ]] 00:06:23.341 13:18:41 -- setup/devices.sh@59 -- # local pci status 00:06:23.341 13:18:41 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:23.341 13:18:41 -- setup/devices.sh@47 -- # setup output config 00:06:23.341 13:18:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:23.341 13:18:41 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:23.341 13:18:41 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:27.537 13:18:45 -- setup/devices.sh@63 -- # found=1 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.537 13:18:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.537 13:18:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:29.442 13:18:47 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:29.442 13:18:47 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:29.442 13:18:47 -- setup/devices.sh@68 -- # return 0 00:06:29.442 13:18:47 -- setup/devices.sh@187 -- # cleanup_dm 
00:06:29.442 13:18:47 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:29.442 13:18:47 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:29.442 13:18:47 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:29.442 13:18:47 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:29.442 13:18:47 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:29.442 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:29.442 13:18:47 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:29.442 13:18:47 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:29.442 00:06:29.442 real 0m15.211s 00:06:29.442 user 0m4.069s 00:06:29.443 sys 0m8.146s 00:06:29.443 13:18:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.443 13:18:47 -- common/autotest_common.sh@10 -- # set +x 00:06:29.443 ************************************ 00:06:29.443 END TEST dm_mount 00:06:29.443 ************************************ 00:06:29.443 13:18:48 -- setup/devices.sh@1 -- # cleanup 00:06:29.443 13:18:48 -- setup/devices.sh@11 -- # cleanup_nvme 00:06:29.443 13:18:48 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:29.443 13:18:48 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:29.443 13:18:48 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:29.443 13:18:48 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:29.443 13:18:48 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:29.443 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:29.443 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:06:29.443 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:29.443 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:29.443 
13:18:48 -- setup/devices.sh@12 -- # cleanup_dm 00:06:29.443 13:18:48 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:29.443 13:18:48 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:29.443 13:18:48 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:29.443 13:18:48 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:29.443 13:18:48 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:29.443 13:18:48 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:29.443 00:06:29.443 real 0m42.429s 00:06:29.443 user 0m12.186s 00:06:29.443 sys 0m25.020s 00:06:29.443 13:18:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.443 13:18:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.443 ************************************ 00:06:29.443 END TEST devices 00:06:29.443 ************************************ 00:06:29.702 00:06:29.702 real 2m32.626s 00:06:29.702 user 0m45.365s 00:06:29.702 sys 1m31.060s 00:06:29.702 13:18:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.702 13:18:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.702 ************************************ 00:06:29.702 END TEST setup.sh 00:06:29.702 ************************************ 00:06:29.702 13:18:48 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:33.896 Hugepages 00:06:33.896 node hugesize free / total 00:06:33.896 node0 1048576kB 0 / 0 00:06:33.896 node0 2048kB 2048 / 2048 00:06:33.896 node1 1048576kB 0 / 0 00:06:33.896 node1 2048kB 0 / 0 00:06:33.896 00:06:33.896 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:33.896 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:33.896 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:33.896 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:33.896 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:33.896 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 
00:06:33.896 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:33.896 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:33.896 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:33.896 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:06:33.896 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:33.896 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:33.896 13:18:52 -- spdk/autotest.sh@141 -- # uname -s 00:06:33.896 13:18:52 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:06:33.896 13:18:52 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:06:33.896 13:18:52 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:38.093 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:38.093 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:41.412 0000:1a:00.0 (8086 0a54): 
nvme -> vfio-pci 00:06:42.788 13:19:01 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:43.724 13:19:02 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:43.724 13:19:02 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:43.724 13:19:02 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:06:43.724 13:19:02 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:06:43.724 13:19:02 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:43.724 13:19:02 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:43.724 13:19:02 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:43.724 13:19:02 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:43.724 13:19:02 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:43.983 13:19:02 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:43.983 13:19:02 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:1a:00.0 00:06:43.983 13:19:02 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:48.175 Waiting for block devices as requested 00:06:48.175 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:06:48.175 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:48.175 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:48.175 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:48.175 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:48.435 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:48.435 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:48.435 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:48.695 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:48.695 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:48.954 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:48.954 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:48.954 0000:80:04.4 (8086 2021): vfio-pci -> 
ioatdma 00:06:49.214 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:49.214 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:49.214 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:49.473 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:51.378 13:19:10 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:06:51.378 13:19:10 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:1a:00.0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1487 -- # grep 0000:1a:00.0/nvme/nvme 00:06:51.378 13:19:10 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 ]] 00:06:51.378 13:19:10 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:06:51.378 13:19:10 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1530 -- # grep oacs 00:06:51.378 13:19:10 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:06:51.378 13:19:10 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:06:51.378 13:19:10 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:06:51.378 13:19:10 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:06:51.378 13:19:10 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:06:51.378 13:19:10 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:06:51.378 13:19:10 -- 
common/autotest_common.sh@1539 -- # cut -d: -f2 00:06:51.378 13:19:10 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:06:51.378 13:19:10 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:06:51.378 13:19:10 -- common/autotest_common.sh@1542 -- # continue 00:06:51.378 13:19:10 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:06:51.378 13:19:10 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:51.378 13:19:10 -- common/autotest_common.sh@10 -- # set +x 00:06:51.378 13:19:10 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:06:51.378 13:19:10 -- common/autotest_common.sh@712 -- # xtrace_disable 00:06:51.378 13:19:10 -- common/autotest_common.sh@10 -- # set +x 00:06:51.378 13:19:10 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:55.584 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:55.584 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:58.872 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:07:00.799 13:19:19 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:07:00.799 13:19:19 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:00.799 13:19:19 -- 
common/autotest_common.sh@10 -- # set +x 00:07:00.799 13:19:19 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:07:00.799 13:19:19 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:07:00.799 13:19:19 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:07:00.799 13:19:19 -- common/autotest_common.sh@1562 -- # bdfs=() 00:07:00.799 13:19:19 -- common/autotest_common.sh@1562 -- # local bdfs 00:07:00.799 13:19:19 -- common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:07:00.799 13:19:19 -- common/autotest_common.sh@1498 -- # bdfs=() 00:07:00.799 13:19:19 -- common/autotest_common.sh@1498 -- # local bdfs 00:07:00.799 13:19:19 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:00.799 13:19:19 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:00.799 13:19:19 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:07:00.799 13:19:19 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:07:00.799 13:19:19 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:1a:00.0 00:07:00.799 13:19:19 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:07:00.799 13:19:19 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:1a:00.0/device 00:07:00.799 13:19:19 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:07:00.799 13:19:19 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:07:00.799 13:19:19 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:07:00.799 13:19:19 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:1a:00.0 00:07:00.799 13:19:19 -- common/autotest_common.sh@1577 -- # [[ -z 0000:1a:00.0 ]] 00:07:00.799 13:19:19 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=3145919 00:07:00.799 13:19:19 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.799 13:19:19 
-- common/autotest_common.sh@1583 -- # waitforlisten 3145919 00:07:00.799 13:19:19 -- common/autotest_common.sh@819 -- # '[' -z 3145919 ']' 00:07:00.799 13:19:19 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.799 13:19:19 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:00.799 13:19:19 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.799 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:00.799 13:19:19 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:00.799 13:19:19 -- common/autotest_common.sh@10 -- # set +x 00:07:00.799 [2024-07-24 13:19:19.564656] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:00.799 [2024-07-24 13:19:19.564743] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3145919 ] 00:07:00.799 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.058 [2024-07-24 13:19:19.682344] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.059 [2024-07-24 13:19:19.726582] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:01.059 [2024-07-24 13:19:19.726738] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.041 13:19:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:02.041 13:19:20 -- common/autotest_common.sh@852 -- # return 0 00:07:02.041 13:19:20 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:07:02.041 13:19:20 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:07:02.042 13:19:20 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 
0000:1a:00.0 00:07:05.340 nvme0n1 00:07:05.340 13:19:23 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:07:05.340 [2024-07-24 13:19:23.837389] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:07:05.340 request: 00:07:05.340 { 00:07:05.340 "nvme_ctrlr_name": "nvme0", 00:07:05.340 "password": "test", 00:07:05.340 "method": "bdev_nvme_opal_revert", 00:07:05.340 "req_id": 1 00:07:05.340 } 00:07:05.340 Got JSON-RPC error response 00:07:05.340 response: 00:07:05.340 { 00:07:05.340 "code": -32602, 00:07:05.340 "message": "Invalid parameters" 00:07:05.340 } 00:07:05.340 13:19:23 -- common/autotest_common.sh@1589 -- # true 00:07:05.340 13:19:23 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:07:05.340 13:19:23 -- common/autotest_common.sh@1593 -- # killprocess 3145919 00:07:05.340 13:19:23 -- common/autotest_common.sh@926 -- # '[' -z 3145919 ']' 00:07:05.340 13:19:23 -- common/autotest_common.sh@930 -- # kill -0 3145919 00:07:05.340 13:19:23 -- common/autotest_common.sh@931 -- # uname 00:07:05.340 13:19:23 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:05.340 13:19:23 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3145919 00:07:05.340 13:19:23 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:05.340 13:19:23 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:05.340 13:19:23 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3145919' 00:07:05.340 killing process with pid 3145919 00:07:05.340 13:19:23 -- common/autotest_common.sh@945 -- # kill 3145919 00:07:05.340 13:19:23 -- common/autotest_common.sh@950 -- # wait 3145919 00:07:09.530 13:19:27 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:07:09.531 13:19:27 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:07:09.531 13:19:27 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:07:09.531 13:19:27 -- 
spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:07:09.531 13:19:27 -- spdk/autotest.sh@173 -- # timing_enter lib 00:07:09.531 13:19:27 -- common/autotest_common.sh@712 -- # xtrace_disable 00:07:09.531 13:19:27 -- common/autotest_common.sh@10 -- # set +x 00:07:09.531 13:19:27 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:07:09.531 13:19:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:09.531 13:19:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.531 13:19:27 -- common/autotest_common.sh@10 -- # set +x 00:07:09.531 ************************************ 00:07:09.531 START TEST env 00:07:09.531 ************************************ 00:07:09.531 13:19:27 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:07:09.531 * Looking for test storage... 00:07:09.531 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:07:09.531 13:19:27 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:07:09.531 13:19:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:09.531 13:19:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.531 13:19:27 -- common/autotest_common.sh@10 -- # set +x 00:07:09.531 ************************************ 00:07:09.531 START TEST env_memory 00:07:09.531 ************************************ 00:07:09.531 13:19:27 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:07:09.531 00:07:09.531 00:07:09.531 CUnit - A unit testing framework for C - Version 2.1-3 00:07:09.531 http://cunit.sourceforge.net/ 00:07:09.531 00:07:09.531 00:07:09.531 Suite: memory 00:07:09.531 Test: alloc and free memory map ...[2024-07-24 13:19:28.027558] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: 
*ERROR*: Initial mem_map notify failed 00:07:09.531 passed 00:07:09.531 Test: mem map translation ...[2024-07-24 13:19:28.047890] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:09.531 [2024-07-24 13:19:28.047916] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:09.531 [2024-07-24 13:19:28.047963] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:09.531 [2024-07-24 13:19:28.047976] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:09.531 passed 00:07:09.531 Test: mem map registration ...[2024-07-24 13:19:28.082322] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:07:09.531 [2024-07-24 13:19:28.082345] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:07:09.531 passed 00:07:09.531 Test: mem map adjacent registrations ...passed 00:07:09.531 00:07:09.531 Run Summary: Type Total Ran Passed Failed Inactive 00:07:09.531 suites 1 1 n/a 0 0 00:07:09.531 tests 4 4 4 0 0 00:07:09.531 asserts 152 152 152 0 n/a 00:07:09.531 00:07:09.531 Elapsed time = 0.126 seconds 00:07:09.531 00:07:09.531 real 0m0.140s 00:07:09.531 user 0m0.127s 00:07:09.531 sys 0m0.013s 00:07:09.531 13:19:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.531 13:19:28 -- common/autotest_common.sh@10 -- # set +x 00:07:09.531 
************************************ 00:07:09.531 END TEST env_memory 00:07:09.531 ************************************ 00:07:09.531 13:19:28 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:09.531 13:19:28 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:09.531 13:19:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.531 13:19:28 -- common/autotest_common.sh@10 -- # set +x 00:07:09.531 ************************************ 00:07:09.531 START TEST env_vtophys 00:07:09.531 ************************************ 00:07:09.531 13:19:28 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:09.531 EAL: lib.eal log level changed from notice to debug 00:07:09.531 EAL: Detected lcore 0 as core 0 on socket 0 00:07:09.531 EAL: Detected lcore 1 as core 1 on socket 0 00:07:09.531 EAL: Detected lcore 2 as core 2 on socket 0 00:07:09.531 EAL: Detected lcore 3 as core 3 on socket 0 00:07:09.531 EAL: Detected lcore 4 as core 4 on socket 0 00:07:09.531 EAL: Detected lcore 5 as core 8 on socket 0 00:07:09.531 EAL: Detected lcore 6 as core 9 on socket 0 00:07:09.531 EAL: Detected lcore 7 as core 10 on socket 0 00:07:09.531 EAL: Detected lcore 8 as core 11 on socket 0 00:07:09.531 EAL: Detected lcore 9 as core 16 on socket 0 00:07:09.531 EAL: Detected lcore 10 as core 17 on socket 0 00:07:09.531 EAL: Detected lcore 11 as core 18 on socket 0 00:07:09.531 EAL: Detected lcore 12 as core 19 on socket 0 00:07:09.531 EAL: Detected lcore 13 as core 20 on socket 0 00:07:09.531 EAL: Detected lcore 14 as core 24 on socket 0 00:07:09.531 EAL: Detected lcore 15 as core 25 on socket 0 00:07:09.531 EAL: Detected lcore 16 as core 26 on socket 0 00:07:09.531 EAL: Detected lcore 17 as core 27 on socket 0 00:07:09.531 EAL: Detected lcore 18 as core 0 on socket 1 00:07:09.531 EAL: Detected lcore 19 as core 1 on socket 1 00:07:09.531 EAL: 
Detected lcore 20 as core 2 on socket 1 00:07:09.531 EAL: Detected lcore 21 as core 3 on socket 1 00:07:09.531 EAL: Detected lcore 22 as core 4 on socket 1 00:07:09.531 EAL: Detected lcore 23 as core 8 on socket 1 00:07:09.531 EAL: Detected lcore 24 as core 9 on socket 1 00:07:09.531 EAL: Detected lcore 25 as core 10 on socket 1 00:07:09.531 EAL: Detected lcore 26 as core 11 on socket 1 00:07:09.531 EAL: Detected lcore 27 as core 16 on socket 1 00:07:09.531 EAL: Detected lcore 28 as core 17 on socket 1 00:07:09.531 EAL: Detected lcore 29 as core 18 on socket 1 00:07:09.531 EAL: Detected lcore 30 as core 19 on socket 1 00:07:09.531 EAL: Detected lcore 31 as core 20 on socket 1 00:07:09.531 EAL: Detected lcore 32 as core 24 on socket 1 00:07:09.531 EAL: Detected lcore 33 as core 25 on socket 1 00:07:09.531 EAL: Detected lcore 34 as core 26 on socket 1 00:07:09.531 EAL: Detected lcore 35 as core 27 on socket 1 00:07:09.531 EAL: Detected lcore 36 as core 0 on socket 0 00:07:09.531 EAL: Detected lcore 37 as core 1 on socket 0 00:07:09.531 EAL: Detected lcore 38 as core 2 on socket 0 00:07:09.531 EAL: Detected lcore 39 as core 3 on socket 0 00:07:09.531 EAL: Detected lcore 40 as core 4 on socket 0 00:07:09.531 EAL: Detected lcore 41 as core 8 on socket 0 00:07:09.531 EAL: Detected lcore 42 as core 9 on socket 0 00:07:09.531 EAL: Detected lcore 43 as core 10 on socket 0 00:07:09.531 EAL: Detected lcore 44 as core 11 on socket 0 00:07:09.531 EAL: Detected lcore 45 as core 16 on socket 0 00:07:09.531 EAL: Detected lcore 46 as core 17 on socket 0 00:07:09.531 EAL: Detected lcore 47 as core 18 on socket 0 00:07:09.531 EAL: Detected lcore 48 as core 19 on socket 0 00:07:09.531 EAL: Detected lcore 49 as core 20 on socket 0 00:07:09.531 EAL: Detected lcore 50 as core 24 on socket 0 00:07:09.531 EAL: Detected lcore 51 as core 25 on socket 0 00:07:09.531 EAL: Detected lcore 52 as core 26 on socket 0 00:07:09.531 EAL: Detected lcore 53 as core 27 on socket 0 00:07:09.531 EAL: 
Detected lcore 54 as core 0 on socket 1 00:07:09.531 EAL: Detected lcore 55 as core 1 on socket 1 00:07:09.531 EAL: Detected lcore 56 as core 2 on socket 1 00:07:09.531 EAL: Detected lcore 57 as core 3 on socket 1 00:07:09.532 EAL: Detected lcore 58 as core 4 on socket 1 00:07:09.532 EAL: Detected lcore 59 as core 8 on socket 1 00:07:09.532 EAL: Detected lcore 60 as core 9 on socket 1 00:07:09.532 EAL: Detected lcore 61 as core 10 on socket 1 00:07:09.532 EAL: Detected lcore 62 as core 11 on socket 1 00:07:09.532 EAL: Detected lcore 63 as core 16 on socket 1 00:07:09.532 EAL: Detected lcore 64 as core 17 on socket 1 00:07:09.532 EAL: Detected lcore 65 as core 18 on socket 1 00:07:09.532 EAL: Detected lcore 66 as core 19 on socket 1 00:07:09.532 EAL: Detected lcore 67 as core 20 on socket 1 00:07:09.532 EAL: Detected lcore 68 as core 24 on socket 1 00:07:09.532 EAL: Detected lcore 69 as core 25 on socket 1 00:07:09.532 EAL: Detected lcore 70 as core 26 on socket 1 00:07:09.532 EAL: Detected lcore 71 as core 27 on socket 1 00:07:09.532 EAL: Maximum logical cores by configuration: 128 00:07:09.532 EAL: Detected CPU lcores: 72 00:07:09.532 EAL: Detected NUMA nodes: 2 00:07:09.532 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:07:09.532 EAL: Checking presence of .so 'librte_eal.so.24' 00:07:09.532 EAL: Checking presence of .so 'librte_eal.so' 00:07:09.532 EAL: Detected static linkage of DPDK 00:07:09.532 EAL: No shared files mode enabled, IPC will be disabled 00:07:09.532 EAL: Bus pci wants IOVA as 'DC' 00:07:09.532 EAL: Buses did not request a specific IOVA mode. 00:07:09.532 EAL: IOMMU is available, selecting IOVA as VA mode. 00:07:09.532 EAL: Selected IOVA mode 'VA' 00:07:09.532 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.532 EAL: Probing VFIO support... 
00:07:09.532 EAL: IOMMU type 1 (Type 1) is supported 00:07:09.532 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:09.532 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:09.532 EAL: VFIO support initialized 00:07:09.532 EAL: Ask a virtual area of 0x2e000 bytes 00:07:09.532 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:09.532 EAL: Setting up physically contiguous memory... 00:07:09.532 EAL: Setting maximum number of open files to 524288 00:07:09.532 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:09.532 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:09.532 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: 
Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:09.532 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:09.532 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:09.532 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.532 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:09.532 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.532 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.532 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 
00:07:09.532 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:09.532 EAL: Hugepages will be freed exactly as allocated. 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: TSC frequency is ~2300000 KHz 00:07:09.532 EAL: Main lcore 0 is ready (tid=7fd73d165a00;cpuset=[0]) 00:07:09.532 EAL: Trying to obtain current memory policy. 00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 0 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 2MB 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Mem event callback 'spdk:(nil)' registered 00:07:09.532 00:07:09.532 00:07:09.532 CUnit - A unit testing framework for C - Version 2.1-3 00:07:09.532 http://cunit.sourceforge.net/ 00:07:09.532 00:07:09.532 00:07:09.532 Suite: components_suite 00:07:09.532 Test: vtophys_malloc_test ...passed 00:07:09.532 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 4MB 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was shrunk by 4MB 00:07:09.532 EAL: Trying to obtain current memory policy. 
00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 6MB 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was shrunk by 6MB 00:07:09.532 EAL: Trying to obtain current memory policy. 00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 10MB 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was shrunk by 10MB 00:07:09.532 EAL: Trying to obtain current memory policy. 00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 18MB 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was shrunk by 18MB 00:07:09.532 EAL: Trying to obtain current memory policy. 
00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 34MB 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was shrunk by 34MB 00:07:09.532 EAL: Trying to obtain current memory policy. 00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was expanded by 66MB 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.532 EAL: No shared files mode enabled, IPC is disabled 00:07:09.532 EAL: Heap on socket 0 was shrunk by 66MB 00:07:09.532 EAL: Trying to obtain current memory policy. 00:07:09.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.532 EAL: Restoring previous memory policy: 4 00:07:09.532 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.532 EAL: request: mp_malloc_sync 00:07:09.533 EAL: No shared files mode enabled, IPC is disabled 00:07:09.533 EAL: Heap on socket 0 was expanded by 130MB 00:07:09.792 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.792 EAL: request: mp_malloc_sync 00:07:09.792 EAL: No shared files mode enabled, IPC is disabled 00:07:09.792 EAL: Heap on socket 0 was shrunk by 130MB 00:07:09.792 EAL: Trying to obtain current memory policy. 
00:07:09.792 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.792 EAL: Restoring previous memory policy: 4 00:07:09.792 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.792 EAL: request: mp_malloc_sync 00:07:09.792 EAL: No shared files mode enabled, IPC is disabled 00:07:09.792 EAL: Heap on socket 0 was expanded by 258MB 00:07:09.792 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.792 EAL: request: mp_malloc_sync 00:07:09.792 EAL: No shared files mode enabled, IPC is disabled 00:07:09.792 EAL: Heap on socket 0 was shrunk by 258MB 00:07:09.792 EAL: Trying to obtain current memory policy. 00:07:09.792 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:10.051 EAL: Restoring previous memory policy: 4 00:07:10.051 EAL: Calling mem event callback 'spdk:(nil)' 00:07:10.051 EAL: request: mp_malloc_sync 00:07:10.051 EAL: No shared files mode enabled, IPC is disabled 00:07:10.051 EAL: Heap on socket 0 was expanded by 514MB 00:07:10.051 EAL: Calling mem event callback 'spdk:(nil)' 00:07:10.051 EAL: request: mp_malloc_sync 00:07:10.051 EAL: No shared files mode enabled, IPC is disabled 00:07:10.051 EAL: Heap on socket 0 was shrunk by 514MB 00:07:10.051 EAL: Trying to obtain current memory policy. 
00:07:10.051 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:10.311 EAL: Restoring previous memory policy: 4 00:07:10.311 EAL: Calling mem event callback 'spdk:(nil)' 00:07:10.311 EAL: request: mp_malloc_sync 00:07:10.311 EAL: No shared files mode enabled, IPC is disabled 00:07:10.311 EAL: Heap on socket 0 was expanded by 1026MB 00:07:10.570 EAL: Calling mem event callback 'spdk:(nil)' 00:07:10.829 EAL: request: mp_malloc_sync 00:07:10.829 EAL: No shared files mode enabled, IPC is disabled 00:07:10.829 EAL: Heap on socket 0 was shrunk by 1026MB 00:07:10.829 passed 00:07:10.829 00:07:10.829 Run Summary: Type Total Ran Passed Failed Inactive 00:07:10.829 suites 1 1 n/a 0 0 00:07:10.829 tests 2 2 2 0 0 00:07:10.829 asserts 497 497 497 0 n/a 00:07:10.829 00:07:10.829 Elapsed time = 1.162 seconds 00:07:10.829 EAL: Calling mem event callback 'spdk:(nil)' 00:07:10.829 EAL: request: mp_malloc_sync 00:07:10.829 EAL: No shared files mode enabled, IPC is disabled 00:07:10.829 EAL: Heap on socket 0 was shrunk by 2MB 00:07:10.829 EAL: No shared files mode enabled, IPC is disabled 00:07:10.829 EAL: No shared files mode enabled, IPC is disabled 00:07:10.829 EAL: No shared files mode enabled, IPC is disabled 00:07:10.829 00:07:10.829 real 0m1.335s 00:07:10.829 user 0m0.751s 00:07:10.829 sys 0m0.554s 00:07:10.829 13:19:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.829 13:19:29 -- common/autotest_common.sh@10 -- # set +x 00:07:10.829 ************************************ 00:07:10.829 END TEST env_vtophys 00:07:10.829 ************************************ 00:07:10.829 13:19:29 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:07:10.829 13:19:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:10.829 13:19:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:10.829 13:19:29 -- common/autotest_common.sh@10 -- # set +x 00:07:10.829 ************************************ 
00:07:10.829 START TEST env_pci 00:07:10.829 ************************************ 00:07:10.829 13:19:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:07:10.829 00:07:10.829 00:07:10.829 CUnit - A unit testing framework for C - Version 2.1-3 00:07:10.829 http://cunit.sourceforge.net/ 00:07:10.829 00:07:10.829 00:07:10.829 Suite: pci 00:07:10.829 Test: pci_hook ...[2024-07-24 13:19:29.573520] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3147273 has claimed it 00:07:10.829 EAL: Cannot find device (10000:00:01.0) 00:07:10.829 EAL: Failed to attach device on primary process 00:07:10.829 passed 00:07:10.829 00:07:10.829 Run Summary: Type Total Ran Passed Failed Inactive 00:07:10.829 suites 1 1 n/a 0 0 00:07:10.829 tests 1 1 1 0 0 00:07:10.830 asserts 25 25 25 0 n/a 00:07:10.830 00:07:10.830 Elapsed time = 0.046 seconds 00:07:10.830 00:07:10.830 real 0m0.067s 00:07:10.830 user 0m0.014s 00:07:10.830 sys 0m0.053s 00:07:10.830 13:19:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.830 13:19:29 -- common/autotest_common.sh@10 -- # set +x 00:07:10.830 ************************************ 00:07:10.830 END TEST env_pci 00:07:10.830 ************************************ 00:07:10.830 13:19:29 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:07:10.830 13:19:29 -- env/env.sh@15 -- # uname 00:07:10.830 13:19:29 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:07:10.830 13:19:29 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:07:10.830 13:19:29 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:10.830 13:19:29 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:07:10.830 13:19:29 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:07:10.830 13:19:29 -- common/autotest_common.sh@10 -- # set +x 00:07:10.830 ************************************ 00:07:10.830 START TEST env_dpdk_post_init 00:07:10.830 ************************************ 00:07:10.830 13:19:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:11.089 EAL: Detected CPU lcores: 72 00:07:11.089 EAL: Detected NUMA nodes: 2 00:07:11.089 EAL: Detected static linkage of DPDK 00:07:11.089 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:11.089 EAL: Selected IOVA mode 'VA' 00:07:11.089 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.089 EAL: VFIO support initialized 00:07:11.089 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:11.089 EAL: Using IOMMU type 1 (Type 1) 00:07:12.026 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:1a:00.0 (socket 0) 00:07:17.294 EAL: Releasing PCI mapped resource for 0000:1a:00.0 00:07:17.294 EAL: Calling pci_unmap_resource for 0000:1a:00.0 at 0x202001000000 00:07:17.553 Starting DPDK initialization... 00:07:17.553 Starting SPDK post initialization... 00:07:17.553 SPDK NVMe probe 00:07:17.553 Attaching to 0000:1a:00.0 00:07:17.553 Attached to 0000:1a:00.0 00:07:17.553 Cleaning up... 
00:07:17.553 00:07:17.553 real 0m6.514s 00:07:17.553 user 0m4.840s 00:07:17.553 sys 0m0.922s 00:07:17.553 13:19:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.553 13:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:17.553 ************************************ 00:07:17.553 END TEST env_dpdk_post_init 00:07:17.553 ************************************ 00:07:17.553 13:19:36 -- env/env.sh@26 -- # uname 00:07:17.553 13:19:36 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:07:17.553 13:19:36 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:17.553 13:19:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:17.553 13:19:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.553 13:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:17.553 ************************************ 00:07:17.553 START TEST env_mem_callbacks 00:07:17.553 ************************************ 00:07:17.553 13:19:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:17.553 EAL: Detected CPU lcores: 72 00:07:17.553 EAL: Detected NUMA nodes: 2 00:07:17.553 EAL: Detected static linkage of DPDK 00:07:17.553 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:17.553 EAL: Selected IOVA mode 'VA' 00:07:17.553 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.553 EAL: VFIO support initialized 00:07:17.553 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:17.553 00:07:17.553 00:07:17.553 CUnit - A unit testing framework for C - Version 2.1-3 00:07:17.553 http://cunit.sourceforge.net/ 00:07:17.553 00:07:17.553 00:07:17.553 Suite: memory 00:07:17.553 Test: test ... 
00:07:17.553 register 0x200000200000 2097152 00:07:17.553 malloc 3145728 00:07:17.553 register 0x200000400000 4194304 00:07:17.553 buf 0x200000500000 len 3145728 PASSED 00:07:17.553 malloc 64 00:07:17.553 buf 0x2000004fff40 len 64 PASSED 00:07:17.553 malloc 4194304 00:07:17.553 register 0x200000800000 6291456 00:07:17.553 buf 0x200000a00000 len 4194304 PASSED 00:07:17.553 free 0x200000500000 3145728 00:07:17.553 free 0x2000004fff40 64 00:07:17.553 unregister 0x200000400000 4194304 PASSED 00:07:17.553 free 0x200000a00000 4194304 00:07:17.553 unregister 0x200000800000 6291456 PASSED 00:07:17.553 malloc 8388608 00:07:17.553 register 0x200000400000 10485760 00:07:17.553 buf 0x200000600000 len 8388608 PASSED 00:07:17.553 free 0x200000600000 8388608 00:07:17.553 unregister 0x200000400000 10485760 PASSED 00:07:17.553 passed 00:07:17.553 00:07:17.553 Run Summary: Type Total Ran Passed Failed Inactive 00:07:17.553 suites 1 1 n/a 0 0 00:07:17.553 tests 1 1 1 0 0 00:07:17.553 asserts 15 15 15 0 n/a 00:07:17.553 00:07:17.553 Elapsed time = 0.008 seconds 00:07:17.553 00:07:17.553 real 0m0.090s 00:07:17.553 user 0m0.025s 00:07:17.553 sys 0m0.064s 00:07:17.553 13:19:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.553 13:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:17.553 ************************************ 00:07:17.553 END TEST env_mem_callbacks 00:07:17.554 ************************************ 00:07:17.554 00:07:17.554 real 0m8.491s 00:07:17.554 user 0m5.880s 00:07:17.554 sys 0m1.875s 00:07:17.554 13:19:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.554 13:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:17.554 ************************************ 00:07:17.554 END TEST env 00:07:17.554 ************************************ 00:07:17.813 13:19:36 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:07:17.813 13:19:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 
']' 00:07:17.813 13:19:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.813 13:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:17.813 ************************************ 00:07:17.813 START TEST rpc 00:07:17.813 ************************************ 00:07:17.813 13:19:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:07:17.813 * Looking for test storage... 00:07:17.813 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:17.813 13:19:36 -- rpc/rpc.sh@65 -- # spdk_pid=3148371 00:07:17.813 13:19:36 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:17.813 13:19:36 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:07:17.813 13:19:36 -- rpc/rpc.sh@67 -- # waitforlisten 3148371 00:07:17.813 13:19:36 -- common/autotest_common.sh@819 -- # '[' -z 3148371 ']' 00:07:17.813 13:19:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:17.813 13:19:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:17.813 13:19:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:17.813 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:17.813 13:19:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:17.813 13:19:36 -- common/autotest_common.sh@10 -- # set +x 00:07:17.813 [2024-07-24 13:19:36.556415] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:17.813 [2024-07-24 13:19:36.556497] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3148371 ] 00:07:17.813 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.071 [2024-07-24 13:19:36.678819] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.072 [2024-07-24 13:19:36.727893] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:18.072 [2024-07-24 13:19:36.728053] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:18.072 [2024-07-24 13:19:36.728071] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3148371' to capture a snapshot of events at runtime. 00:07:18.072 [2024-07-24 13:19:36.728085] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3148371 for offline analysis/debug. 
00:07:18.072 [2024-07-24 13:19:36.728112] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.007 13:19:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:19.007 13:19:37 -- common/autotest_common.sh@852 -- # return 0 00:07:19.007 13:19:37 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:19.007 13:19:37 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:19.007 13:19:37 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:19.007 13:19:37 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:19.007 13:19:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.007 13:19:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.007 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.007 ************************************ 00:07:19.007 START TEST rpc_integrity 00:07:19.007 ************************************ 00:07:19.008 13:19:37 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:07:19.008 13:19:37 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:19.008 13:19:37 -- rpc/rpc.sh@13 -- # jq length 00:07:19.008 13:19:37 -- rpc/rpc.sh@13 
-- # '[' 0 == 0 ']' 00:07:19.008 13:19:37 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:19.008 13:19:37 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:19.008 { 00:07:19.008 "name": "Malloc0", 00:07:19.008 "aliases": [ 00:07:19.008 "228109ee-f457-430d-9761-b77103e14f6e" 00:07:19.008 ], 00:07:19.008 "product_name": "Malloc disk", 00:07:19.008 "block_size": 512, 00:07:19.008 "num_blocks": 16384, 00:07:19.008 "uuid": "228109ee-f457-430d-9761-b77103e14f6e", 00:07:19.008 "assigned_rate_limits": { 00:07:19.008 "rw_ios_per_sec": 0, 00:07:19.008 "rw_mbytes_per_sec": 0, 00:07:19.008 "r_mbytes_per_sec": 0, 00:07:19.008 "w_mbytes_per_sec": 0 00:07:19.008 }, 00:07:19.008 "claimed": false, 00:07:19.008 "zoned": false, 00:07:19.008 "supported_io_types": { 00:07:19.008 "read": true, 00:07:19.008 "write": true, 00:07:19.008 "unmap": true, 00:07:19.008 "write_zeroes": true, 00:07:19.008 "flush": true, 00:07:19.008 "reset": true, 00:07:19.008 "compare": false, 00:07:19.008 "compare_and_write": false, 00:07:19.008 "abort": true, 00:07:19.008 "nvme_admin": false, 00:07:19.008 "nvme_io": false 00:07:19.008 }, 00:07:19.008 "memory_domains": [ 00:07:19.008 { 00:07:19.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.008 "dma_device_type": 2 00:07:19.008 } 00:07:19.008 ], 00:07:19.008 "driver_specific": {} 00:07:19.008 } 00:07:19.008 ]' 00:07:19.008 13:19:37 -- rpc/rpc.sh@17 -- # jq length 00:07:19.008 13:19:37 -- rpc/rpc.sh@17 -- # '[' 
1 == 1 ']' 00:07:19.008 13:19:37 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 [2024-07-24 13:19:37.667703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:19.008 [2024-07-24 13:19:37.667753] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:19.008 [2024-07-24 13:19:37.667776] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x505bf90 00:07:19.008 [2024-07-24 13:19:37.667790] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:19.008 [2024-07-24 13:19:37.668961] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:19.008 [2024-07-24 13:19:37.668995] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:19.008 Passthru0 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:19.008 { 00:07:19.008 "name": "Malloc0", 00:07:19.008 "aliases": [ 00:07:19.008 "228109ee-f457-430d-9761-b77103e14f6e" 00:07:19.008 ], 00:07:19.008 "product_name": "Malloc disk", 00:07:19.008 "block_size": 512, 00:07:19.008 "num_blocks": 16384, 00:07:19.008 "uuid": "228109ee-f457-430d-9761-b77103e14f6e", 00:07:19.008 "assigned_rate_limits": { 00:07:19.008 "rw_ios_per_sec": 0, 00:07:19.008 "rw_mbytes_per_sec": 0, 00:07:19.008 "r_mbytes_per_sec": 0, 00:07:19.008 "w_mbytes_per_sec": 0 00:07:19.008 }, 00:07:19.008 "claimed": true, 00:07:19.008 "claim_type": "exclusive_write", 
00:07:19.008 "zoned": false, 00:07:19.008 "supported_io_types": { 00:07:19.008 "read": true, 00:07:19.008 "write": true, 00:07:19.008 "unmap": true, 00:07:19.008 "write_zeroes": true, 00:07:19.008 "flush": true, 00:07:19.008 "reset": true, 00:07:19.008 "compare": false, 00:07:19.008 "compare_and_write": false, 00:07:19.008 "abort": true, 00:07:19.008 "nvme_admin": false, 00:07:19.008 "nvme_io": false 00:07:19.008 }, 00:07:19.008 "memory_domains": [ 00:07:19.008 { 00:07:19.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.008 "dma_device_type": 2 00:07:19.008 } 00:07:19.008 ], 00:07:19.008 "driver_specific": {} 00:07:19.008 }, 00:07:19.008 { 00:07:19.008 "name": "Passthru0", 00:07:19.008 "aliases": [ 00:07:19.008 "79cb4f64-90fc-5681-a560-84b374a4ec03" 00:07:19.008 ], 00:07:19.008 "product_name": "passthru", 00:07:19.008 "block_size": 512, 00:07:19.008 "num_blocks": 16384, 00:07:19.008 "uuid": "79cb4f64-90fc-5681-a560-84b374a4ec03", 00:07:19.008 "assigned_rate_limits": { 00:07:19.008 "rw_ios_per_sec": 0, 00:07:19.008 "rw_mbytes_per_sec": 0, 00:07:19.008 "r_mbytes_per_sec": 0, 00:07:19.008 "w_mbytes_per_sec": 0 00:07:19.008 }, 00:07:19.008 "claimed": false, 00:07:19.008 "zoned": false, 00:07:19.008 "supported_io_types": { 00:07:19.008 "read": true, 00:07:19.008 "write": true, 00:07:19.008 "unmap": true, 00:07:19.008 "write_zeroes": true, 00:07:19.008 "flush": true, 00:07:19.008 "reset": true, 00:07:19.008 "compare": false, 00:07:19.008 "compare_and_write": false, 00:07:19.008 "abort": true, 00:07:19.008 "nvme_admin": false, 00:07:19.008 "nvme_io": false 00:07:19.008 }, 00:07:19.008 "memory_domains": [ 00:07:19.008 { 00:07:19.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.008 "dma_device_type": 2 00:07:19.008 } 00:07:19.008 ], 00:07:19.008 "driver_specific": { 00:07:19.008 "passthru": { 00:07:19.008 "name": "Passthru0", 00:07:19.008 "base_bdev_name": "Malloc0" 00:07:19.008 } 00:07:19.008 } 00:07:19.008 } 00:07:19.008 ]' 00:07:19.008 13:19:37 -- 
rpc/rpc.sh@21 -- # jq length 00:07:19.008 13:19:37 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:19.008 13:19:37 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.008 13:19:37 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:19.008 13:19:37 -- rpc/rpc.sh@26 -- # jq length 00:07:19.008 13:19:37 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:19.008 00:07:19.008 real 0m0.295s 00:07:19.008 user 0m0.181s 00:07:19.008 sys 0m0.042s 00:07:19.008 13:19:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 ************************************ 00:07:19.008 END TEST rpc_integrity 00:07:19.008 ************************************ 00:07:19.008 13:19:37 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:19.008 13:19:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.008 13:19:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.008 ************************************ 00:07:19.008 START TEST rpc_plugins 00:07:19.008 ************************************ 00:07:19.008 13:19:37 -- common/autotest_common.sh@1104 -- # 
rpc_plugins 00:07:19.008 13:19:37 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:19.008 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.008 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.267 13:19:37 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:19.267 13:19:37 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:19.267 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.267 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.267 13:19:37 -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:19.267 { 00:07:19.267 "name": "Malloc1", 00:07:19.267 "aliases": [ 00:07:19.267 "a0cfada4-c0b9-40f5-b68b-f9db5b338158" 00:07:19.267 ], 00:07:19.267 "product_name": "Malloc disk", 00:07:19.267 "block_size": 4096, 00:07:19.267 "num_blocks": 256, 00:07:19.267 "uuid": "a0cfada4-c0b9-40f5-b68b-f9db5b338158", 00:07:19.267 "assigned_rate_limits": { 00:07:19.267 "rw_ios_per_sec": 0, 00:07:19.267 "rw_mbytes_per_sec": 0, 00:07:19.267 "r_mbytes_per_sec": 0, 00:07:19.267 "w_mbytes_per_sec": 0 00:07:19.267 }, 00:07:19.267 "claimed": false, 00:07:19.267 "zoned": false, 00:07:19.267 "supported_io_types": { 00:07:19.267 "read": true, 00:07:19.267 "write": true, 00:07:19.267 "unmap": true, 00:07:19.267 "write_zeroes": true, 00:07:19.267 "flush": true, 00:07:19.267 "reset": true, 00:07:19.267 "compare": false, 00:07:19.267 "compare_and_write": false, 00:07:19.267 "abort": true, 00:07:19.267 "nvme_admin": false, 00:07:19.267 "nvme_io": false 00:07:19.267 }, 00:07:19.267 "memory_domains": [ 00:07:19.267 { 00:07:19.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.267 "dma_device_type": 2 00:07:19.267 } 00:07:19.267 ], 00:07:19.267 "driver_specific": {} 00:07:19.267 } 00:07:19.267 ]' 00:07:19.267 13:19:37 -- rpc/rpc.sh@32 -- # jq length 00:07:19.267 13:19:37 -- rpc/rpc.sh@32 -- # '[' 
1 == 1 ']' 00:07:19.267 13:19:37 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:19.267 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.267 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.267 13:19:37 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:19.267 13:19:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.267 13:19:37 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 13:19:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.267 13:19:37 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:19.267 13:19:37 -- rpc/rpc.sh@36 -- # jq length 00:07:19.267 13:19:38 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:19.267 00:07:19.267 real 0m0.148s 00:07:19.267 user 0m0.090s 00:07:19.267 sys 0m0.021s 00:07:19.267 13:19:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.267 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 ************************************ 00:07:19.267 END TEST rpc_plugins 00:07:19.267 ************************************ 00:07:19.267 13:19:38 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:19.267 13:19:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.267 13:19:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.267 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 ************************************ 00:07:19.267 START TEST rpc_trace_cmd_test 00:07:19.267 ************************************ 00:07:19.267 13:19:38 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:07:19.267 13:19:38 -- rpc/rpc.sh@40 -- # local info 00:07:19.267 13:19:38 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:19.267 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.267 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.267 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 
== 0 ]] 00:07:19.267 13:19:38 -- rpc/rpc.sh@42 -- # info='{ 00:07:19.267 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3148371", 00:07:19.267 "tpoint_group_mask": "0x8", 00:07:19.267 "iscsi_conn": { 00:07:19.267 "mask": "0x2", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "scsi": { 00:07:19.267 "mask": "0x4", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "bdev": { 00:07:19.267 "mask": "0x8", 00:07:19.267 "tpoint_mask": "0xffffffffffffffff" 00:07:19.267 }, 00:07:19.267 "nvmf_rdma": { 00:07:19.267 "mask": "0x10", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "nvmf_tcp": { 00:07:19.267 "mask": "0x20", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "ftl": { 00:07:19.267 "mask": "0x40", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "blobfs": { 00:07:19.267 "mask": "0x80", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "dsa": { 00:07:19.267 "mask": "0x200", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "thread": { 00:07:19.267 "mask": "0x400", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "nvme_pcie": { 00:07:19.267 "mask": "0x800", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "iaa": { 00:07:19.267 "mask": "0x1000", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "nvme_tcp": { 00:07:19.267 "mask": "0x2000", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 }, 00:07:19.267 "bdev_nvme": { 00:07:19.267 "mask": "0x4000", 00:07:19.267 "tpoint_mask": "0x0" 00:07:19.267 } 00:07:19.267 }' 00:07:19.267 13:19:38 -- rpc/rpc.sh@43 -- # jq length 00:07:19.267 13:19:38 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:07:19.267 13:19:38 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:19.526 13:19:38 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:19.526 13:19:38 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:19.526 13:19:38 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:19.526 13:19:38 -- rpc/rpc.sh@46 -- # jq 
'has("bdev")' 00:07:19.526 13:19:38 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:19.526 13:19:38 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:19.526 13:19:38 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:19.526 00:07:19.526 real 0m0.236s 00:07:19.526 user 0m0.192s 00:07:19.526 sys 0m0.037s 00:07:19.526 13:19:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.526 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.526 ************************************ 00:07:19.526 END TEST rpc_trace_cmd_test 00:07:19.526 ************************************ 00:07:19.526 13:19:38 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:19.526 13:19:38 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:19.526 13:19:38 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:19.526 13:19:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.526 13:19:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.526 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.526 ************************************ 00:07:19.526 START TEST rpc_daemon_integrity 00:07:19.526 ************************************ 00:07:19.526 13:19:38 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:07:19.526 13:19:38 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:19.526 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.526 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.526 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.526 13:19:38 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:19.526 13:19:38 -- rpc/rpc.sh@13 -- # jq length 00:07:19.785 13:19:38 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:19.785 13:19:38 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:19.785 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.785 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.785 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.785 
13:19:38 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:19.785 13:19:38 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:19.785 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.785 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.785 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.785 13:19:38 -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:19.785 { 00:07:19.785 "name": "Malloc2", 00:07:19.785 "aliases": [ 00:07:19.785 "158e99df-86c3-4c31-bcff-22d8b81f83c7" 00:07:19.785 ], 00:07:19.785 "product_name": "Malloc disk", 00:07:19.785 "block_size": 512, 00:07:19.785 "num_blocks": 16384, 00:07:19.785 "uuid": "158e99df-86c3-4c31-bcff-22d8b81f83c7", 00:07:19.785 "assigned_rate_limits": { 00:07:19.785 "rw_ios_per_sec": 0, 00:07:19.785 "rw_mbytes_per_sec": 0, 00:07:19.785 "r_mbytes_per_sec": 0, 00:07:19.785 "w_mbytes_per_sec": 0 00:07:19.785 }, 00:07:19.785 "claimed": false, 00:07:19.785 "zoned": false, 00:07:19.785 "supported_io_types": { 00:07:19.785 "read": true, 00:07:19.785 "write": true, 00:07:19.785 "unmap": true, 00:07:19.785 "write_zeroes": true, 00:07:19.785 "flush": true, 00:07:19.785 "reset": true, 00:07:19.785 "compare": false, 00:07:19.785 "compare_and_write": false, 00:07:19.785 "abort": true, 00:07:19.785 "nvme_admin": false, 00:07:19.785 "nvme_io": false 00:07:19.785 }, 00:07:19.785 "memory_domains": [ 00:07:19.785 { 00:07:19.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.785 "dma_device_type": 2 00:07:19.785 } 00:07:19.785 ], 00:07:19.785 "driver_specific": {} 00:07:19.785 } 00:07:19.785 ]' 00:07:19.785 13:19:38 -- rpc/rpc.sh@17 -- # jq length 00:07:19.785 13:19:38 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:19.785 13:19:38 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:19.785 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.785 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.785 [2024-07-24 13:19:38.481873] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:07:19.785 [2024-07-24 13:19:38.481917] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:19.785 [2024-07-24 13:19:38.481939] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x505b7b0 00:07:19.785 [2024-07-24 13:19:38.481954] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:19.785 [2024-07-24 13:19:38.482932] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:19.785 [2024-07-24 13:19:38.482962] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:19.785 Passthru0 00:07:19.785 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.785 13:19:38 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:19.785 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.785 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.785 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.785 13:19:38 -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:19.785 { 00:07:19.785 "name": "Malloc2", 00:07:19.785 "aliases": [ 00:07:19.785 "158e99df-86c3-4c31-bcff-22d8b81f83c7" 00:07:19.785 ], 00:07:19.785 "product_name": "Malloc disk", 00:07:19.785 "block_size": 512, 00:07:19.785 "num_blocks": 16384, 00:07:19.785 "uuid": "158e99df-86c3-4c31-bcff-22d8b81f83c7", 00:07:19.785 "assigned_rate_limits": { 00:07:19.785 "rw_ios_per_sec": 0, 00:07:19.785 "rw_mbytes_per_sec": 0, 00:07:19.785 "r_mbytes_per_sec": 0, 00:07:19.785 "w_mbytes_per_sec": 0 00:07:19.785 }, 00:07:19.785 "claimed": true, 00:07:19.785 "claim_type": "exclusive_write", 00:07:19.785 "zoned": false, 00:07:19.785 "supported_io_types": { 00:07:19.785 "read": true, 00:07:19.785 "write": true, 00:07:19.785 "unmap": true, 00:07:19.785 "write_zeroes": true, 00:07:19.785 "flush": true, 00:07:19.785 "reset": true, 00:07:19.785 "compare": false, 00:07:19.785 "compare_and_write": false, 
00:07:19.785 "abort": true, 00:07:19.785 "nvme_admin": false, 00:07:19.785 "nvme_io": false 00:07:19.786 }, 00:07:19.786 "memory_domains": [ 00:07:19.786 { 00:07:19.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.786 "dma_device_type": 2 00:07:19.786 } 00:07:19.786 ], 00:07:19.786 "driver_specific": {} 00:07:19.786 }, 00:07:19.786 { 00:07:19.786 "name": "Passthru0", 00:07:19.786 "aliases": [ 00:07:19.786 "78a4037b-1c41-5465-988b-f7b739b9e671" 00:07:19.786 ], 00:07:19.786 "product_name": "passthru", 00:07:19.786 "block_size": 512, 00:07:19.786 "num_blocks": 16384, 00:07:19.786 "uuid": "78a4037b-1c41-5465-988b-f7b739b9e671", 00:07:19.786 "assigned_rate_limits": { 00:07:19.786 "rw_ios_per_sec": 0, 00:07:19.786 "rw_mbytes_per_sec": 0, 00:07:19.786 "r_mbytes_per_sec": 0, 00:07:19.786 "w_mbytes_per_sec": 0 00:07:19.786 }, 00:07:19.786 "claimed": false, 00:07:19.786 "zoned": false, 00:07:19.786 "supported_io_types": { 00:07:19.786 "read": true, 00:07:19.786 "write": true, 00:07:19.786 "unmap": true, 00:07:19.786 "write_zeroes": true, 00:07:19.786 "flush": true, 00:07:19.786 "reset": true, 00:07:19.786 "compare": false, 00:07:19.786 "compare_and_write": false, 00:07:19.786 "abort": true, 00:07:19.786 "nvme_admin": false, 00:07:19.786 "nvme_io": false 00:07:19.786 }, 00:07:19.786 "memory_domains": [ 00:07:19.786 { 00:07:19.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.786 "dma_device_type": 2 00:07:19.786 } 00:07:19.786 ], 00:07:19.786 "driver_specific": { 00:07:19.786 "passthru": { 00:07:19.786 "name": "Passthru0", 00:07:19.786 "base_bdev_name": "Malloc2" 00:07:19.786 } 00:07:19.786 } 00:07:19.786 } 00:07:19.786 ]' 00:07:19.786 13:19:38 -- rpc/rpc.sh@21 -- # jq length 00:07:19.786 13:19:38 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:19.786 13:19:38 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:19.786 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.786 13:19:38 -- common/autotest_common.sh@10 -- # set +x 
00:07:19.786 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.786 13:19:38 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:19.786 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.786 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.786 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.786 13:19:38 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:19.786 13:19:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.786 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.786 13:19:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.786 13:19:38 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:19.786 13:19:38 -- rpc/rpc.sh@26 -- # jq length 00:07:19.786 13:19:38 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:19.786 00:07:19.786 real 0m0.293s 00:07:19.786 user 0m0.180s 00:07:19.786 sys 0m0.046s 00:07:19.786 13:19:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.786 13:19:38 -- common/autotest_common.sh@10 -- # set +x 00:07:19.786 ************************************ 00:07:19.786 END TEST rpc_daemon_integrity 00:07:19.786 ************************************ 00:07:20.045 13:19:38 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:20.045 13:19:38 -- rpc/rpc.sh@84 -- # killprocess 3148371 00:07:20.045 13:19:38 -- common/autotest_common.sh@926 -- # '[' -z 3148371 ']' 00:07:20.045 13:19:38 -- common/autotest_common.sh@930 -- # kill -0 3148371 00:07:20.045 13:19:38 -- common/autotest_common.sh@931 -- # uname 00:07:20.045 13:19:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:20.045 13:19:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3148371 00:07:20.045 13:19:38 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:20.045 13:19:38 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:20.045 13:19:38 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3148371' 
00:07:20.045 killing process with pid 3148371 00:07:20.045 13:19:38 -- common/autotest_common.sh@945 -- # kill 3148371 00:07:20.045 13:19:38 -- common/autotest_common.sh@950 -- # wait 3148371 00:07:20.305 00:07:20.305 real 0m2.654s 00:07:20.305 user 0m3.353s 00:07:20.305 sys 0m0.821s 00:07:20.305 13:19:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.305 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.305 ************************************ 00:07:20.305 END TEST rpc 00:07:20.305 ************************************ 00:07:20.305 13:19:39 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:20.305 13:19:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.305 13:19:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.305 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.305 ************************************ 00:07:20.305 START TEST rpc_client 00:07:20.305 ************************************ 00:07:20.305 13:19:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:20.564 * Looking for test storage... 
00:07:20.564 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:20.564 13:19:39 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:20.564 OK 00:07:20.564 13:19:39 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:20.564 00:07:20.564 real 0m0.128s 00:07:20.564 user 0m0.056s 00:07:20.564 sys 0m0.081s 00:07:20.564 13:19:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.564 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.564 ************************************ 00:07:20.564 END TEST rpc_client 00:07:20.564 ************************************ 00:07:20.564 13:19:39 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:20.564 13:19:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.564 13:19:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.564 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.564 ************************************ 00:07:20.564 START TEST json_config 00:07:20.564 ************************************ 00:07:20.564 13:19:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:20.564 13:19:39 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:20.564 13:19:39 -- nvmf/common.sh@7 -- # uname -s 00:07:20.564 13:19:39 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:20.564 13:19:39 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:20.564 13:19:39 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:20.564 13:19:39 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:20.564 13:19:39 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:20.564 13:19:39 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:20.564 
13:19:39 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:20.564 13:19:39 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:20.564 13:19:39 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:20.564 13:19:39 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:20.564 13:19:39 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:07:20.564 13:19:39 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:07:20.564 13:19:39 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:20.564 13:19:39 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:20.564 13:19:39 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:20.564 13:19:39 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:20.564 13:19:39 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:20.564 13:19:39 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:20.564 13:19:39 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:20.564 13:19:39 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.564 13:19:39 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.565 13:19:39 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.565 13:19:39 -- paths/export.sh@5 -- # export PATH 00:07:20.565 13:19:39 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.565 13:19:39 -- nvmf/common.sh@46 -- # : 0 00:07:20.565 13:19:39 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:07:20.565 13:19:39 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:07:20.565 13:19:39 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:07:20.565 13:19:39 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:20.565 13:19:39 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:20.565 13:19:39 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:07:20.565 13:19:39 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:07:20.565 13:19:39 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:07:20.565 
13:19:39 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:07:20.565 13:19:39 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:07:20.565 13:19:39 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:07:20.565 13:19:39 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:20.565 13:19:39 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:20.565 WARNING: No tests are enabled so not running JSON configuration tests 00:07:20.565 13:19:39 -- json_config/json_config.sh@27 -- # exit 0 00:07:20.565 00:07:20.565 real 0m0.102s 00:07:20.565 user 0m0.055s 00:07:20.565 sys 0m0.048s 00:07:20.565 13:19:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.565 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.565 ************************************ 00:07:20.565 END TEST json_config 00:07:20.565 ************************************ 00:07:20.824 13:19:39 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:20.824 13:19:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.824 13:19:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.824 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.824 ************************************ 00:07:20.824 START TEST json_config_extra_key 00:07:20.824 ************************************ 00:07:20.824 13:19:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:20.824 13:19:39 -- nvmf/common.sh@7 -- # uname -s 00:07:20.824 13:19:39 -- nvmf/common.sh@7 -- # [[ 
Linux == FreeBSD ]] 00:07:20.824 13:19:39 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:20.824 13:19:39 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:20.824 13:19:39 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:20.824 13:19:39 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:20.824 13:19:39 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:20.824 13:19:39 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:20.824 13:19:39 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:20.824 13:19:39 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:20.824 13:19:39 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:20.824 13:19:39 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:07:20.824 13:19:39 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:07:20.824 13:19:39 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:20.824 13:19:39 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:20.824 13:19:39 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:20.824 13:19:39 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:20.824 13:19:39 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:20.824 13:19:39 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:20.824 13:19:39 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:20.824 13:19:39 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.824 13:19:39 -- 
paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.824 13:19:39 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.824 13:19:39 -- paths/export.sh@5 -- # export PATH 00:07:20.824 13:19:39 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.824 13:19:39 -- nvmf/common.sh@46 -- # : 0 00:07:20.824 13:19:39 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:07:20.824 13:19:39 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:07:20.824 13:19:39 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:07:20.824 13:19:39 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:20.824 13:19:39 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:20.824 13:19:39 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:07:20.824 13:19:39 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:07:20.824 13:19:39 -- nvmf/common.sh@50 -- # 
have_pci_nics=0 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:07:20.824 INFO: launching applications... 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@25 -- # shift 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=3149025 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:07:20.824 Waiting for target to run... 
00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 3149025 /var/tmp/spdk_tgt.sock 00:07:20.824 13:19:39 -- common/autotest_common.sh@819 -- # '[' -z 3149025 ']' 00:07:20.824 13:19:39 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:20.824 13:19:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:20.824 13:19:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:20.824 13:19:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:20.824 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:20.824 13:19:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:20.824 13:19:39 -- common/autotest_common.sh@10 -- # set +x 00:07:20.825 [2024-07-24 13:19:39.579249] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:20.825 [2024-07-24 13:19:39.579353] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3149025 ] 00:07:20.825 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.392 [2024-07-24 13:19:40.112437] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.392 [2024-07-24 13:19:40.142439] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.392 [2024-07-24 13:19:40.142581] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.651 13:19:40 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:21.651 13:19:40 -- common/autotest_common.sh@852 -- # return 0 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:07:21.651 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:07:21.651 INFO: shutting down applications... 
00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 3149025 ]] 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 3149025 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3149025 00:07:21.651 13:19:40 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3149025 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@52 -- # break 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:07:22.220 SPDK target shutdown done 00:07:22.220 13:19:40 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:07:22.220 Success 00:07:22.220 00:07:22.220 real 0m1.522s 00:07:22.220 user 0m1.114s 00:07:22.220 sys 0m0.633s 00:07:22.220 13:19:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.220 13:19:40 -- common/autotest_common.sh@10 -- # set +x 00:07:22.220 ************************************ 00:07:22.220 END TEST json_config_extra_key 00:07:22.220 ************************************ 00:07:22.220 13:19:41 -- spdk/autotest.sh@180 -- # run_test alias_rpc 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:22.220 13:19:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:22.220 13:19:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:22.220 13:19:41 -- common/autotest_common.sh@10 -- # set +x 00:07:22.220 ************************************ 00:07:22.220 START TEST alias_rpc 00:07:22.220 ************************************ 00:07:22.220 13:19:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:22.488 * Looking for test storage... 00:07:22.488 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:22.488 13:19:41 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:22.488 13:19:41 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3149247 00:07:22.488 13:19:41 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:22.488 13:19:41 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3149247 00:07:22.488 13:19:41 -- common/autotest_common.sh@819 -- # '[' -z 3149247 ']' 00:07:22.488 13:19:41 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.488 13:19:41 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:22.488 13:19:41 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:22.488 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:22.488 13:19:41 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:22.488 13:19:41 -- common/autotest_common.sh@10 -- # set +x 00:07:22.488 [2024-07-24 13:19:41.159548] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:22.488 [2024-07-24 13:19:41.159634] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3149247 ] 00:07:22.488 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.488 [2024-07-24 13:19:41.281008] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.488 [2024-07-24 13:19:41.325841] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:22.488 [2024-07-24 13:19:41.325994] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.466 13:19:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:23.466 13:19:42 -- common/autotest_common.sh@852 -- # return 0 00:07:23.466 13:19:42 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:23.726 13:19:42 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3149247 00:07:23.726 13:19:42 -- common/autotest_common.sh@926 -- # '[' -z 3149247 ']' 00:07:23.726 13:19:42 -- common/autotest_common.sh@930 -- # kill -0 3149247 00:07:23.726 13:19:42 -- common/autotest_common.sh@931 -- # uname 00:07:23.726 13:19:42 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:23.726 13:19:42 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3149247 00:07:23.726 13:19:42 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:23.726 13:19:42 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:23.726 13:19:42 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3149247' 00:07:23.726 killing process with pid 3149247 00:07:23.726 13:19:42 -- common/autotest_common.sh@945 -- # kill 3149247 00:07:23.726 13:19:42 -- common/autotest_common.sh@950 -- # wait 3149247 00:07:23.984 00:07:23.984 real 0m1.737s 00:07:23.984 user 0m1.912s 00:07:23.984 sys 0m0.556s 
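The shutdown sequence traced earlier (json_config_extra_key.sh, and the killprocess helper used throughout this log) sends SIGINT to the target and then polls with `kill -0` for up to 30 half-second intervals. A minimal standalone sketch of that pattern, using a plain `sleep` process as a stand-in for spdk_tgt (the 30-iteration bound and 0.5 s sleep mirror the script; the stand-in process and SIGTERM are illustrative substitutions):

```shell
#!/usr/bin/env bash
# Stand-in for spdk_tgt: a background process we can signal.
sleep 60 &
app_pid=$!

# The real script sends SIGINT (which spdk_tgt handles); background
# children of a non-interactive shell ignore SIGINT, so this sketch
# uses SIGTERM instead. The polling loop is the same: kill -0
# succeeds while the process is alive and fails once it has exited.
kill -TERM "$app_pid" 2>/dev/null
status_msg=''
for (( i = 0; i < 30; i++ )); do
    if ! kill -0 "$app_pid" 2>/dev/null; then
        status_msg='SPDK target shutdown done'
        break
    fi
    sleep 0.5
done
echo "$status_msg"
```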
00:07:23.984 13:19:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.984 13:19:42 -- common/autotest_common.sh@10 -- # set +x 00:07:23.984 ************************************ 00:07:23.984 END TEST alias_rpc 00:07:23.984 ************************************ 00:07:23.984 13:19:42 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:07:23.984 13:19:42 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:23.984 13:19:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:23.984 13:19:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:23.984 13:19:42 -- common/autotest_common.sh@10 -- # set +x 00:07:23.984 ************************************ 00:07:23.984 START TEST spdkcli_tcp 00:07:23.985 ************************************ 00:07:23.985 13:19:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:24.244 * Looking for test storage... 
00:07:24.244 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:24.244 13:19:42 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:24.244 13:19:42 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:24.244 13:19:42 -- common/autotest_common.sh@712 -- # xtrace_disable 00:07:24.244 13:19:42 -- common/autotest_common.sh@10 -- # set +x 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3149488 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@27 -- # waitforlisten 3149488 00:07:24.244 13:19:42 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:24.244 13:19:42 -- common/autotest_common.sh@819 -- # '[' -z 3149488 ']' 00:07:24.244 13:19:42 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.244 13:19:42 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:24.244 13:19:42 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.244 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
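The `waitforlisten` step logged here blocks with `max_retries=100` until spdk_tgt is accepting RPCs on /var/tmp/spdk.sock. A self-contained sketch of that retry structure, with a delayed marker file standing in for the socket so it runs anywhere (the socket path, the delay, and the marker-file check are illustrative; only the bounded-retry shape comes from the harness):

```shell
#!/usr/bin/env bash
# Stand-in for /var/tmp/spdk.sock appearing once spdk_tgt is up:
# a background subshell creates a marker file after a short delay.
sock_path=$(mktemp -u)
( sleep 1; touch "$sock_path" ) &

# Poll up to max_retries times, as waitforlisten does; the real
# harness tests the UNIX socket and tries an RPC, this sketch
# just checks that the marker file exists.
max_retries=100
ready=0
for (( i = 0; i < max_retries; i++ )); do
    if [ -e "$sock_path" ]; then
        ready=1
        break
    fi
    sleep 0.1
done
wait
rm -f "$sock_path"
```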
00:07:24.244 13:19:42 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:24.244 13:19:42 -- common/autotest_common.sh@10 -- # set +x 00:07:24.244 [2024-07-24 13:19:42.931029] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:24.244 [2024-07-24 13:19:42.931108] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3149488 ] 00:07:24.244 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.244 [2024-07-24 13:19:43.051554] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:24.244 [2024-07-24 13:19:43.097180] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:24.244 [2024-07-24 13:19:43.097392] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.244 [2024-07-24 13:19:43.097397] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.181 13:19:43 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:25.181 13:19:43 -- common/autotest_common.sh@852 -- # return 0 00:07:25.181 13:19:43 -- spdkcli/tcp.sh@31 -- # socat_pid=3149669 00:07:25.181 13:19:43 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:25.181 13:19:43 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:25.440 [ 00:07:25.440 "spdk_get_version", 00:07:25.440 "rpc_get_methods", 00:07:25.440 "trace_get_info", 00:07:25.441 "trace_get_tpoint_group_mask", 00:07:25.441 "trace_disable_tpoint_group", 00:07:25.441 "trace_enable_tpoint_group", 00:07:25.441 "trace_clear_tpoint_mask", 00:07:25.441 "trace_set_tpoint_mask", 00:07:25.441 "vfu_tgt_set_base_path", 00:07:25.441 "framework_get_pci_devices", 00:07:25.441 "framework_get_config", 00:07:25.441 
"framework_get_subsystems", 00:07:25.441 "iobuf_get_stats", 00:07:25.441 "iobuf_set_options", 00:07:25.441 "sock_set_default_impl", 00:07:25.441 "sock_impl_set_options", 00:07:25.441 "sock_impl_get_options", 00:07:25.441 "vmd_rescan", 00:07:25.441 "vmd_remove_device", 00:07:25.441 "vmd_enable", 00:07:25.441 "accel_get_stats", 00:07:25.441 "accel_set_options", 00:07:25.441 "accel_set_driver", 00:07:25.441 "accel_crypto_key_destroy", 00:07:25.441 "accel_crypto_keys_get", 00:07:25.441 "accel_crypto_key_create", 00:07:25.441 "accel_assign_opc", 00:07:25.441 "accel_get_module_info", 00:07:25.441 "accel_get_opc_assignments", 00:07:25.441 "notify_get_notifications", 00:07:25.441 "notify_get_types", 00:07:25.441 "bdev_get_histogram", 00:07:25.441 "bdev_enable_histogram", 00:07:25.441 "bdev_set_qos_limit", 00:07:25.441 "bdev_set_qd_sampling_period", 00:07:25.441 "bdev_get_bdevs", 00:07:25.441 "bdev_reset_iostat", 00:07:25.441 "bdev_get_iostat", 00:07:25.441 "bdev_examine", 00:07:25.441 "bdev_wait_for_examine", 00:07:25.441 "bdev_set_options", 00:07:25.441 "scsi_get_devices", 00:07:25.441 "thread_set_cpumask", 00:07:25.441 "framework_get_scheduler", 00:07:25.441 "framework_set_scheduler", 00:07:25.441 "framework_get_reactors", 00:07:25.441 "thread_get_io_channels", 00:07:25.441 "thread_get_pollers", 00:07:25.441 "thread_get_stats", 00:07:25.441 "framework_monitor_context_switch", 00:07:25.441 "spdk_kill_instance", 00:07:25.441 "log_enable_timestamps", 00:07:25.441 "log_get_flags", 00:07:25.441 "log_clear_flag", 00:07:25.441 "log_set_flag", 00:07:25.441 "log_get_level", 00:07:25.441 "log_set_level", 00:07:25.441 "log_get_print_level", 00:07:25.441 "log_set_print_level", 00:07:25.441 "framework_enable_cpumask_locks", 00:07:25.441 "framework_disable_cpumask_locks", 00:07:25.441 "framework_wait_init", 00:07:25.441 "framework_start_init", 00:07:25.441 "virtio_blk_create_transport", 00:07:25.441 "virtio_blk_get_transports", 00:07:25.441 "vhost_controller_set_coalescing", 
00:07:25.441 "vhost_get_controllers", 00:07:25.441 "vhost_delete_controller", 00:07:25.441 "vhost_create_blk_controller", 00:07:25.441 "vhost_scsi_controller_remove_target", 00:07:25.441 "vhost_scsi_controller_add_target", 00:07:25.441 "vhost_start_scsi_controller", 00:07:25.441 "vhost_create_scsi_controller", 00:07:25.441 "ublk_recover_disk", 00:07:25.441 "ublk_get_disks", 00:07:25.441 "ublk_stop_disk", 00:07:25.441 "ublk_start_disk", 00:07:25.441 "ublk_destroy_target", 00:07:25.441 "ublk_create_target", 00:07:25.441 "nbd_get_disks", 00:07:25.441 "nbd_stop_disk", 00:07:25.441 "nbd_start_disk", 00:07:25.441 "env_dpdk_get_mem_stats", 00:07:25.441 "nvmf_subsystem_get_listeners", 00:07:25.441 "nvmf_subsystem_get_qpairs", 00:07:25.441 "nvmf_subsystem_get_controllers", 00:07:25.441 "nvmf_get_stats", 00:07:25.441 "nvmf_get_transports", 00:07:25.441 "nvmf_create_transport", 00:07:25.441 "nvmf_get_targets", 00:07:25.441 "nvmf_delete_target", 00:07:25.441 "nvmf_create_target", 00:07:25.441 "nvmf_subsystem_allow_any_host", 00:07:25.441 "nvmf_subsystem_remove_host", 00:07:25.441 "nvmf_subsystem_add_host", 00:07:25.441 "nvmf_subsystem_remove_ns", 00:07:25.441 "nvmf_subsystem_add_ns", 00:07:25.441 "nvmf_subsystem_listener_set_ana_state", 00:07:25.441 "nvmf_discovery_get_referrals", 00:07:25.441 "nvmf_discovery_remove_referral", 00:07:25.441 "nvmf_discovery_add_referral", 00:07:25.441 "nvmf_subsystem_remove_listener", 00:07:25.441 "nvmf_subsystem_add_listener", 00:07:25.441 "nvmf_delete_subsystem", 00:07:25.441 "nvmf_create_subsystem", 00:07:25.441 "nvmf_get_subsystems", 00:07:25.441 "nvmf_set_crdt", 00:07:25.441 "nvmf_set_config", 00:07:25.441 "nvmf_set_max_subsystems", 00:07:25.441 "iscsi_set_options", 00:07:25.441 "iscsi_get_auth_groups", 00:07:25.441 "iscsi_auth_group_remove_secret", 00:07:25.441 "iscsi_auth_group_add_secret", 00:07:25.441 "iscsi_delete_auth_group", 00:07:25.441 "iscsi_create_auth_group", 00:07:25.441 "iscsi_set_discovery_auth", 00:07:25.441 
"iscsi_get_options", 00:07:25.441 "iscsi_target_node_request_logout", 00:07:25.441 "iscsi_target_node_set_redirect", 00:07:25.441 "iscsi_target_node_set_auth", 00:07:25.441 "iscsi_target_node_add_lun", 00:07:25.441 "iscsi_get_connections", 00:07:25.441 "iscsi_portal_group_set_auth", 00:07:25.441 "iscsi_start_portal_group", 00:07:25.441 "iscsi_delete_portal_group", 00:07:25.441 "iscsi_create_portal_group", 00:07:25.441 "iscsi_get_portal_groups", 00:07:25.441 "iscsi_delete_target_node", 00:07:25.441 "iscsi_target_node_remove_pg_ig_maps", 00:07:25.441 "iscsi_target_node_add_pg_ig_maps", 00:07:25.441 "iscsi_create_target_node", 00:07:25.441 "iscsi_get_target_nodes", 00:07:25.441 "iscsi_delete_initiator_group", 00:07:25.441 "iscsi_initiator_group_remove_initiators", 00:07:25.441 "iscsi_initiator_group_add_initiators", 00:07:25.441 "iscsi_create_initiator_group", 00:07:25.441 "iscsi_get_initiator_groups", 00:07:25.441 "vfu_virtio_create_scsi_endpoint", 00:07:25.441 "vfu_virtio_scsi_remove_target", 00:07:25.441 "vfu_virtio_scsi_add_target", 00:07:25.441 "vfu_virtio_create_blk_endpoint", 00:07:25.441 "vfu_virtio_delete_endpoint", 00:07:25.441 "iaa_scan_accel_module", 00:07:25.441 "dsa_scan_accel_module", 00:07:25.441 "ioat_scan_accel_module", 00:07:25.441 "accel_error_inject_error", 00:07:25.441 "bdev_iscsi_delete", 00:07:25.441 "bdev_iscsi_create", 00:07:25.441 "bdev_iscsi_set_options", 00:07:25.441 "bdev_virtio_attach_controller", 00:07:25.441 "bdev_virtio_scsi_get_devices", 00:07:25.441 "bdev_virtio_detach_controller", 00:07:25.441 "bdev_virtio_blk_set_hotplug", 00:07:25.441 "bdev_ftl_set_property", 00:07:25.441 "bdev_ftl_get_properties", 00:07:25.441 "bdev_ftl_get_stats", 00:07:25.441 "bdev_ftl_unmap", 00:07:25.441 "bdev_ftl_unload", 00:07:25.441 "bdev_ftl_delete", 00:07:25.441 "bdev_ftl_load", 00:07:25.441 "bdev_ftl_create", 00:07:25.441 "bdev_aio_delete", 00:07:25.441 "bdev_aio_rescan", 00:07:25.441 "bdev_aio_create", 00:07:25.441 "blobfs_create", 00:07:25.441 
"blobfs_detect", 00:07:25.441 "blobfs_set_cache_size", 00:07:25.441 "bdev_zone_block_delete", 00:07:25.441 "bdev_zone_block_create", 00:07:25.441 "bdev_delay_delete", 00:07:25.441 "bdev_delay_create", 00:07:25.441 "bdev_delay_update_latency", 00:07:25.441 "bdev_split_delete", 00:07:25.441 "bdev_split_create", 00:07:25.441 "bdev_error_inject_error", 00:07:25.441 "bdev_error_delete", 00:07:25.441 "bdev_error_create", 00:07:25.441 "bdev_raid_set_options", 00:07:25.441 "bdev_raid_remove_base_bdev", 00:07:25.441 "bdev_raid_add_base_bdev", 00:07:25.441 "bdev_raid_delete", 00:07:25.441 "bdev_raid_create", 00:07:25.441 "bdev_raid_get_bdevs", 00:07:25.441 "bdev_lvol_grow_lvstore", 00:07:25.441 "bdev_lvol_get_lvols", 00:07:25.441 "bdev_lvol_get_lvstores", 00:07:25.441 "bdev_lvol_delete", 00:07:25.441 "bdev_lvol_set_read_only", 00:07:25.441 "bdev_lvol_resize", 00:07:25.441 "bdev_lvol_decouple_parent", 00:07:25.442 "bdev_lvol_inflate", 00:07:25.442 "bdev_lvol_rename", 00:07:25.442 "bdev_lvol_clone_bdev", 00:07:25.442 "bdev_lvol_clone", 00:07:25.442 "bdev_lvol_snapshot", 00:07:25.442 "bdev_lvol_create", 00:07:25.442 "bdev_lvol_delete_lvstore", 00:07:25.442 "bdev_lvol_rename_lvstore", 00:07:25.442 "bdev_lvol_create_lvstore", 00:07:25.442 "bdev_passthru_delete", 00:07:25.442 "bdev_passthru_create", 00:07:25.442 "bdev_nvme_cuse_unregister", 00:07:25.442 "bdev_nvme_cuse_register", 00:07:25.442 "bdev_opal_new_user", 00:07:25.442 "bdev_opal_set_lock_state", 00:07:25.442 "bdev_opal_delete", 00:07:25.442 "bdev_opal_get_info", 00:07:25.442 "bdev_opal_create", 00:07:25.442 "bdev_nvme_opal_revert", 00:07:25.442 "bdev_nvme_opal_init", 00:07:25.442 "bdev_nvme_send_cmd", 00:07:25.442 "bdev_nvme_get_path_iostat", 00:07:25.442 "bdev_nvme_get_mdns_discovery_info", 00:07:25.442 "bdev_nvme_stop_mdns_discovery", 00:07:25.442 "bdev_nvme_start_mdns_discovery", 00:07:25.442 "bdev_nvme_set_multipath_policy", 00:07:25.442 "bdev_nvme_set_preferred_path", 00:07:25.442 "bdev_nvme_get_io_paths", 
00:07:25.442 "bdev_nvme_remove_error_injection", 00:07:25.442 "bdev_nvme_add_error_injection", 00:07:25.442 "bdev_nvme_get_discovery_info", 00:07:25.442 "bdev_nvme_stop_discovery", 00:07:25.442 "bdev_nvme_start_discovery", 00:07:25.442 "bdev_nvme_get_controller_health_info", 00:07:25.442 "bdev_nvme_disable_controller", 00:07:25.442 "bdev_nvme_enable_controller", 00:07:25.442 "bdev_nvme_reset_controller", 00:07:25.442 "bdev_nvme_get_transport_statistics", 00:07:25.442 "bdev_nvme_apply_firmware", 00:07:25.442 "bdev_nvme_detach_controller", 00:07:25.442 "bdev_nvme_get_controllers", 00:07:25.442 "bdev_nvme_attach_controller", 00:07:25.442 "bdev_nvme_set_hotplug", 00:07:25.442 "bdev_nvme_set_options", 00:07:25.442 "bdev_null_resize", 00:07:25.442 "bdev_null_delete", 00:07:25.442 "bdev_null_create", 00:07:25.442 "bdev_malloc_delete", 00:07:25.442 "bdev_malloc_create" 00:07:25.442 ] 00:07:25.442 13:19:44 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:25.442 13:19:44 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:25.442 13:19:44 -- common/autotest_common.sh@10 -- # set +x 00:07:25.442 13:19:44 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:25.442 13:19:44 -- spdkcli/tcp.sh@38 -- # killprocess 3149488 00:07:25.442 13:19:44 -- common/autotest_common.sh@926 -- # '[' -z 3149488 ']' 00:07:25.442 13:19:44 -- common/autotest_common.sh@930 -- # kill -0 3149488 00:07:25.442 13:19:44 -- common/autotest_common.sh@931 -- # uname 00:07:25.442 13:19:44 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:25.442 13:19:44 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3149488 00:07:25.442 13:19:44 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:25.442 13:19:44 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:25.442 13:19:44 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3149488' 00:07:25.442 killing process with pid 3149488 00:07:25.442 13:19:44 -- 
common/autotest_common.sh@945 -- # kill 3149488 00:07:25.442 13:19:44 -- common/autotest_common.sh@950 -- # wait 3149488 00:07:25.701 00:07:25.701 real 0m1.728s 00:07:25.701 user 0m3.345s 00:07:25.701 sys 0m0.536s 00:07:25.701 13:19:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.701 13:19:44 -- common/autotest_common.sh@10 -- # set +x 00:07:25.701 ************************************ 00:07:25.701 END TEST spdkcli_tcp 00:07:25.701 ************************************ 00:07:25.961 13:19:44 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:25.961 13:19:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:25.961 13:19:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.961 13:19:44 -- common/autotest_common.sh@10 -- # set +x 00:07:25.961 ************************************ 00:07:25.961 START TEST dpdk_mem_utility 00:07:25.961 ************************************ 00:07:25.961 13:19:44 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:25.961 * Looking for test storage... 
00:07:25.961 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:25.961 13:19:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:25.961 13:19:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3149897 00:07:25.961 13:19:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3149897 00:07:25.961 13:19:44 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.961 13:19:44 -- common/autotest_common.sh@819 -- # '[' -z 3149897 ']' 00:07:25.961 13:19:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.961 13:19:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:25.961 13:19:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.961 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.961 13:19:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:25.961 13:19:44 -- common/autotest_common.sh@10 -- # set +x 00:07:25.961 [2024-07-24 13:19:44.715963] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:25.961 [2024-07-24 13:19:44.716041] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3149897 ] 00:07:25.961 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.961 [2024-07-24 13:19:44.824550] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.220 [2024-07-24 13:19:44.873324] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.220 [2024-07-24 13:19:44.873486] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.156 13:19:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:27.156 13:19:45 -- common/autotest_common.sh@852 -- # return 0 00:07:27.156 13:19:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:27.156 13:19:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:27.156 13:19:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:27.156 13:19:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.156 { 00:07:27.156 "filename": "/tmp/spdk_mem_dump.txt" 00:07:27.156 } 00:07:27.156 13:19:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:27.156 13:19:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:27.156 DPDK memory size 814.000000 MiB in 1 heap(s) 00:07:27.156 1 heaps totaling size 814.000000 MiB 00:07:27.156 size: 814.000000 MiB heap id: 0 00:07:27.156 end heaps---------- 00:07:27.156 8 mempools totaling size 598.116089 MiB 00:07:27.156 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:27.156 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:27.156 size: 84.521057 MiB name: bdev_io_3149897 00:07:27.156 size: 51.011292 MiB name: evtpool_3149897 00:07:27.156 size: 
50.003479 MiB name: msgpool_3149897 00:07:27.156 size: 21.763794 MiB name: PDU_Pool 00:07:27.156 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:27.156 size: 0.026123 MiB name: Session_Pool 00:07:27.156 end mempools------- 00:07:27.156 6 memzones totaling size 4.142822 MiB 00:07:27.156 size: 1.000366 MiB name: RG_ring_0_3149897 00:07:27.156 size: 1.000366 MiB name: RG_ring_1_3149897 00:07:27.156 size: 1.000366 MiB name: RG_ring_4_3149897 00:07:27.156 size: 1.000366 MiB name: RG_ring_5_3149897 00:07:27.156 size: 0.125366 MiB name: RG_ring_2_3149897 00:07:27.156 size: 0.015991 MiB name: RG_ring_3_3149897 00:07:27.156 end memzones------- 00:07:27.156 13:19:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:27.156 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:07:27.156 list of free elements. size: 12.519348 MiB 00:07:27.156 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:27.156 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:27.156 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:27.156 element at address: 0x200003e00000 with size: 0.996277 MiB 00:07:27.156 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:27.156 element at address: 0x200013800000 with size: 0.978699 MiB 00:07:27.156 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:27.156 element at address: 0x200019200000 with size: 0.936584 MiB 00:07:27.156 element at address: 0x200000200000 with size: 0.841614 MiB 00:07:27.156 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:07:27.156 element at address: 0x20000b200000 with size: 0.490723 MiB 00:07:27.156 element at address: 0x200000800000 with size: 0.487793 MiB 00:07:27.156 element at address: 0x200019400000 with size: 0.485657 MiB 00:07:27.156 element at address: 0x200027e00000 with size: 0.410034 MiB 00:07:27.156 element at 
address: 0x200003a00000 with size: 0.355530 MiB 00:07:27.156 list of standard malloc elements. size: 199.218079 MiB 00:07:27.156 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:27.156 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:27.156 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:27.156 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:27.156 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:27.156 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:27.156 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:27.156 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:27.156 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:07:27.156 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:07:27.156 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:07:27.156 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:07:27.156 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003adb300 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003adb500 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003affa80 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003affb40 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200003eff0c0 with size: 0.000183 MiB 
00:07:27.157 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:07:27.157 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:07:27.157 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200027e69040 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:27.157 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:27.157 list of memzone associated elements. 
size: 602.262573 MiB 00:07:27.157 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:27.157 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:27.157 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:27.157 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:27.157 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:27.157 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3149897_0 00:07:27.157 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:27.157 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3149897_0 00:07:27.157 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:27.157 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3149897_0 00:07:27.157 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:27.157 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:27.157 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:27.157 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:27.157 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:27.157 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3149897 00:07:27.157 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:27.157 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3149897 00:07:27.157 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:27.157 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3149897 00:07:27.157 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:27.157 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:27.157 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:27.157 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:27.157 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:27.157 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:27.157 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:27.157 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:27.157 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:27.157 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3149897 00:07:27.157 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:27.157 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3149897 00:07:27.157 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:27.157 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3149897 00:07:27.157 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:27.157 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3149897 00:07:27.157 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:07:27.157 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3149897 00:07:27.157 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:07:27.157 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:27.157 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:27.157 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:27.157 element at address: 0x20001947c540 with size: 0.250488 MiB 00:07:27.157 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:27.157 element at address: 0x200003adf880 with size: 0.125488 MiB 00:07:27.157 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3149897 00:07:27.157 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:07:27.157 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:27.157 element at address: 0x200027e69100 with size: 0.023743 MiB 00:07:27.157 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:27.157 element at address: 0x200003adb5c0 with size: 0.016113 
MiB 00:07:27.157 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3149897 00:07:27.157 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:07:27.157 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:27.157 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:07:27.157 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3149897 00:07:27.157 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:07:27.157 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3149897 00:07:27.157 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:07:27.157 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:27.157 13:19:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:27.157 13:19:45 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3149897 00:07:27.157 13:19:45 -- common/autotest_common.sh@926 -- # '[' -z 3149897 ']' 00:07:27.157 13:19:45 -- common/autotest_common.sh@930 -- # kill -0 3149897 00:07:27.157 13:19:45 -- common/autotest_common.sh@931 -- # uname 00:07:27.157 13:19:45 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:27.157 13:19:45 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3149897 00:07:27.157 13:19:45 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:27.157 13:19:45 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:27.157 13:19:45 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3149897' 00:07:27.157 killing process with pid 3149897 00:07:27.157 13:19:45 -- common/autotest_common.sh@945 -- # kill 3149897 00:07:27.157 13:19:45 -- common/autotest_common.sh@950 -- # wait 3149897 00:07:27.416 00:07:27.416 real 0m1.593s 00:07:27.416 user 0m1.673s 00:07:27.417 sys 0m0.535s 00:07:27.417 13:19:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.417 13:19:46 -- common/autotest_common.sh@10 -- # set +x 00:07:27.417 
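The dpdk_mem_info.py dump above reports "8 mempools totaling size 598.116089 MiB". As a quick sanity check, the per-pool sizes transcribed from the dump can be re-summed with awk and compared against the reported total (they agree up to rounding):

```shell
#!/usr/bin/env bash
# Sum the "size:" column of the mempool entries exactly as they
# appear in the dpdk_mem_info.py output logged above.
total=$(awk '{ sum += $2 } END { printf "%.6f", sum }' <<'EOF'
size: 212.674988 MiB name: PDU_immediate_data_Pool
size: 158.602051 MiB name: PDU_data_out_Pool
size: 84.521057 MiB name: bdev_io_3149897
size: 51.011292 MiB name: evtpool_3149897
size: 50.003479 MiB name: msgpool_3149897
size: 21.763794 MiB name: PDU_Pool
size: 19.513306 MiB name: SCSI_TASK_Pool
size: 0.026123 MiB name: Session_Pool
EOF
)
echo "$total MiB"
```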
************************************ 00:07:27.417 END TEST dpdk_mem_utility 00:07:27.417 ************************************ 00:07:27.417 13:19:46 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:27.417 13:19:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:27.417 13:19:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:27.417 13:19:46 -- common/autotest_common.sh@10 -- # set +x 00:07:27.417 ************************************ 00:07:27.417 START TEST event 00:07:27.417 ************************************ 00:07:27.417 13:19:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:27.675 * Looking for test storage... 00:07:27.675 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:27.675 13:19:46 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:27.675 13:19:46 -- bdev/nbd_common.sh@6 -- # set -e 00:07:27.675 13:19:46 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:27.675 13:19:46 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:27.676 13:19:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:27.676 13:19:46 -- common/autotest_common.sh@10 -- # set +x 00:07:27.676 ************************************ 00:07:27.676 START TEST event_perf 00:07:27.676 ************************************ 00:07:27.676 13:19:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:27.676 Running I/O for 1 seconds...[2024-07-24 13:19:46.364741] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:27.676 [2024-07-24 13:19:46.364845] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3150137 ] 00:07:27.676 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.676 [2024-07-24 13:19:46.485611] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:27.676 [2024-07-24 13:19:46.532773] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.676 [2024-07-24 13:19:46.532843] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:27.676 [2024-07-24 13:19:46.532844] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.676 [2024-07-24 13:19:46.532798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:29.052 Running I/O for 1 seconds... 00:07:29.052 lcore 0: 161503 00:07:29.052 lcore 1: 161502 00:07:29.052 lcore 2: 161500 00:07:29.052 lcore 3: 161502 00:07:29.052 done. 
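The `event_perf -m 0xF` run above starts one reactor per set bit of the coremask, which is why exactly four lcores (0-3) report counts. Decoding such a mask can be sketched in shell as follows; this is purely illustrative, since SPDK/DPDK parse the mask in C during EAL initialization:

```shell
# Decode a hex coremask (e.g. 0xF) into the list of selected lcores.
# Illustrative sketch only; not the EAL's actual parser.
coremask_to_lcores() {
    local mask=$(( $1 ))   # arithmetic expansion accepts 0x-prefixed hex
    local core=0 out=""
    while [ "$mask" -ne 0 ]; do
        if [ $(( mask & 1 )) -eq 1 ]; then
            out="$out $core"   # this bit position is a selected lcore
        fi
        mask=$(( mask >> 1 ))
        core=$(( core + 1 ))
    done
    echo "${out# }"
}
```

For example, `coremask_to_lcores 0xF` yields `0 1 2 3`, matching the four "Reactor started on core N" notices in the trace.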
00:07:29.052 00:07:29.052 real 0m1.258s 00:07:29.052 user 0m4.111s 00:07:29.052 sys 0m0.139s 00:07:29.052 13:19:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.052 13:19:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.052 ************************************ 00:07:29.052 END TEST event_perf 00:07:29.052 ************************************ 00:07:29.052 13:19:47 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:29.052 13:19:47 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:29.052 13:19:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:29.052 13:19:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.052 ************************************ 00:07:29.052 START TEST event_reactor 00:07:29.052 ************************************ 00:07:29.052 13:19:47 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:29.052 [2024-07-24 13:19:47.668745] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:29.052 [2024-07-24 13:19:47.668864] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3150340 ] 00:07:29.052 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.052 [2024-07-24 13:19:47.788263] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.052 [2024-07-24 13:19:47.836039] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.431 test_start 00:07:30.431 oneshot 00:07:30.431 tick 100 00:07:30.431 tick 100 00:07:30.431 tick 250 00:07:30.431 tick 100 00:07:30.431 tick 100 00:07:30.431 tick 100 00:07:30.431 tick 250 00:07:30.431 tick 500 00:07:30.431 tick 100 00:07:30.431 tick 100 00:07:30.431 tick 250 00:07:30.431 tick 100 00:07:30.431 tick 100 00:07:30.431 test_end 00:07:30.431 00:07:30.431 real 0m1.254s 00:07:30.431 user 0m1.120s 00:07:30.431 sys 0m0.127s 00:07:30.431 13:19:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.431 13:19:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.431 ************************************ 00:07:30.431 END TEST event_reactor 00:07:30.431 ************************************ 00:07:30.431 13:19:48 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:30.431 13:19:48 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:30.431 13:19:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:30.431 13:19:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.431 ************************************ 00:07:30.431 START TEST event_reactor_perf 00:07:30.431 ************************************ 00:07:30.431 13:19:48 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:30.431 [2024-07-24 13:19:48.965769] 
Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:30.431 [2024-07-24 13:19:48.965864] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3150536 ] 00:07:30.431 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.431 [2024-07-24 13:19:49.086881] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.431 [2024-07-24 13:19:49.133471] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.366 test_start 00:07:31.366 test_end 00:07:31.366 Performance: 552893 events per second 00:07:31.366 00:07:31.366 real 0m1.256s 00:07:31.366 user 0m1.119s 00:07:31.366 sys 0m0.130s 00:07:31.366 13:19:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.366 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.366 ************************************ 00:07:31.366 END TEST event_reactor_perf 00:07:31.366 ************************************ 00:07:31.625 13:19:50 -- event/event.sh@49 -- # uname -s 00:07:31.625 13:19:50 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:31.625 13:19:50 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:31.625 13:19:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:31.625 13:19:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.625 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.625 ************************************ 00:07:31.625 START TEST event_scheduler 00:07:31.625 ************************************ 00:07:31.625 13:19:50 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:31.625 * Looking for test storage... 
00:07:31.625 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:31.625 13:19:50 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:31.625 13:19:50 -- scheduler/scheduler.sh@35 -- # scheduler_pid=3150758 00:07:31.625 13:19:50 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:31.625 13:19:50 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:31.625 13:19:50 -- scheduler/scheduler.sh@37 -- # waitforlisten 3150758 00:07:31.625 13:19:50 -- common/autotest_common.sh@819 -- # '[' -z 3150758 ']' 00:07:31.625 13:19:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.625 13:19:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:31.625 13:19:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.625 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.625 13:19:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:31.625 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.625 [2024-07-24 13:19:50.375697] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:31.625 [2024-07-24 13:19:50.375759] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3150758 ] 00:07:31.625 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.625 [2024-07-24 13:19:50.451296] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:31.884 [2024-07-24 13:19:50.496103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.884 [2024-07-24 13:19:50.496183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.884 [2024-07-24 13:19:50.496284] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:31.884 [2024-07-24 13:19:50.496285] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:31.884 13:19:50 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:31.884 13:19:50 -- common/autotest_common.sh@852 -- # return 0 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:31.884 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.884 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.884 POWER: Env isn't set yet! 00:07:31.884 POWER: Attempting to initialise ACPI cpufreq power management... 00:07:31.884 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:07:31.884 POWER: Cannot set governor of lcore 0 to userspace 00:07:31.884 POWER: Attempting to initialise PSTAT power management... 
00:07:31.884 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:07:31.884 POWER: Initialized successfully for lcore 0 power management 00:07:31.884 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:07:31.884 POWER: Initialized successfully for lcore 1 power management 00:07:31.884 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:07:31.884 POWER: Initialized successfully for lcore 2 power management 00:07:31.884 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:07:31.884 POWER: Initialized successfully for lcore 3 power management 00:07:31.884 [2024-07-24 13:19:50.638413] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:31.884 [2024-07-24 13:19:50.638428] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:31.884 [2024-07-24 13:19:50.638440] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:31.884 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:31.884 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.884 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.884 [2024-07-24 13:19:50.704380] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
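The `waitforlisten` step earlier in the trace blocks until the daemon's RPC socket (`/var/tmp/spdk.sock`) appears, retrying up to `max_retries` times. A minimal stand-in for that wait loop is sketched below; the helper name is hypothetical and the real implementation lives in `autotest_common.sh`:

```shell
# Wait until a UNIX domain socket file appears, retrying up to
# max_retries times. Hedged sketch of the waitforlisten pattern.
waitforsocket_sketch() {
    local sock=$1 max_retries=${2:-100} i=0
    echo "Waiting for process to start up and listen on UNIX domain socket $sock..."
    while [ "$i" -lt "$max_retries" ]; do
        # -S tests for a socket inode; a stricter check could also
        # attempt a connect (e.g. via nc -U) where available.
        if [ -S "$sock" ]; then
            return 0
        fi
        sleep 0.1
        i=$(( i + 1 ))
    done
    return 1
}
```

The `--wait-for-rpc` flag passed to the scheduler app pairs with this loop: the app holds off subsystem init until the harness connects and issues `framework_start_init`, as the trace shows.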
00:07:31.884 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:31.884 13:19:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:31.884 13:19:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.884 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.884 ************************************ 00:07:31.884 START TEST scheduler_create_thread 00:07:31.884 ************************************ 00:07:31.884 13:19:50 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:31.884 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.884 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.884 2 00:07:31.884 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:31.884 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.884 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.884 3 00:07:31.884 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:31.884 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.884 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.884 4 00:07:31.884 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.884 13:19:50 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:31.884 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.884 
13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 5 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 6 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 7 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 8 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 9 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 10 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 
00:07:32.143 13:19:50 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.143 13:19:50 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:32.143 13:19:50 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:32.143 13:19:50 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.143 13:19:50 -- common/autotest_common.sh@10 -- # set +x 00:07:33.079 13:19:51 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:33.079 13:19:51 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:33.079 13:19:51 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:33.079 13:19:51 -- common/autotest_common.sh@10 -- # set +x 00:07:34.456 13:19:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:34.456 13:19:53 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:34.456 13:19:53 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:34.456 13:19:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:34.456 13:19:53 -- common/autotest_common.sh@10 -- # set +x 00:07:35.391 13:19:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:35.391 00:07:35.391 real 0m3.383s 00:07:35.391 user 0m0.021s 00:07:35.391 sys 0m0.009s 00:07:35.391 13:19:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.391 13:19:54 -- common/autotest_common.sh@10 -- # set +x 00:07:35.391 ************************************ 00:07:35.391 END TEST scheduler_create_thread 00:07:35.391 ************************************ 00:07:35.391 13:19:54 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:35.391 13:19:54 -- 
scheduler/scheduler.sh@46 -- # killprocess 3150758 00:07:35.391 13:19:54 -- common/autotest_common.sh@926 -- # '[' -z 3150758 ']' 00:07:35.391 13:19:54 -- common/autotest_common.sh@930 -- # kill -0 3150758 00:07:35.391 13:19:54 -- common/autotest_common.sh@931 -- # uname 00:07:35.391 13:19:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:35.391 13:19:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3150758 00:07:35.391 13:19:54 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:07:35.391 13:19:54 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:07:35.391 13:19:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3150758' 00:07:35.391 killing process with pid 3150758 00:07:35.391 13:19:54 -- common/autotest_common.sh@945 -- # kill 3150758 00:07:35.391 13:19:54 -- common/autotest_common.sh@950 -- # wait 3150758 00:07:35.649 [2024-07-24 13:19:54.476377] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
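The POWER lines in this scheduler run show a save/switch/restore cycle: each lcore's cpufreq governor is set to 'performance' at startup and put back to the original ('powersave' here) at shutdown. A hedged shell sketch of that pattern follows; the paths are the standard sysfs cpufreq layout, but DPDK performs this in C through its power library, and writing these files requires root:

```shell
# Save-and-switch a CPU's cpufreq scaling governor via sysfs.
# Sketch of the pattern behind the POWER log lines; not DPDK's code.
GOV_PATH_TMPL="/sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor"

governor_path() { printf "$GOV_PATH_TMPL" "$1"; }

set_governor() {
    local cpu=$1 new=$2 path old
    path=$(governor_path "$cpu")
    # The "Failed to write ... scaling_governor" warning in the trace
    # corresponds to this writability check failing for ACPI cpufreq.
    [ -w "$path" ] || return 1
    old=$(cat "$path")
    echo "$new" > "$path"
    echo "$old"   # caller stores this to restore on shutdown
}
```

Restoring is the same call in reverse (`set_governor "$cpu" "$saved"`), which is what produces the paired "set back to the original" notices below.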
00:07:35.907 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:07:35.907 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:07:35.907 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:07:35.907 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:07:35.907 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:07:35.907 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:07:35.908 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:07:35.908 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:07:35.908 00:07:35.908 real 0m4.441s 00:07:35.908 user 0m8.012s 00:07:35.908 sys 0m0.375s 00:07:35.908 13:19:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.908 13:19:54 -- common/autotest_common.sh@10 -- # set +x 00:07:35.908 ************************************ 00:07:35.908 END TEST event_scheduler 00:07:35.908 ************************************ 00:07:35.908 13:19:54 -- event/event.sh@51 -- # modprobe -n nbd 00:07:35.908 13:19:54 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:35.908 13:19:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:35.908 13:19:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:35.908 13:19:54 -- common/autotest_common.sh@10 -- # set +x 00:07:35.908 ************************************ 00:07:35.908 START TEST app_repeat 00:07:35.908 ************************************ 00:07:35.908 13:19:54 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:07:35.908 13:19:54 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.908 13:19:54 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:35.908 
13:19:54 -- event/event.sh@13 -- # local nbd_list 00:07:35.908 13:19:54 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:35.908 13:19:54 -- event/event.sh@14 -- # local bdev_list 00:07:35.908 13:19:54 -- event/event.sh@15 -- # local repeat_times=4 00:07:35.908 13:19:54 -- event/event.sh@17 -- # modprobe nbd 00:07:35.908 13:19:54 -- event/event.sh@19 -- # repeat_pid=3151355 00:07:35.908 13:19:54 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:35.908 13:19:54 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:35.908 13:19:54 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3151355' 00:07:35.908 Process app_repeat pid: 3151355 00:07:35.908 13:19:54 -- event/event.sh@23 -- # for i in {0..2} 00:07:35.908 13:19:54 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:35.908 spdk_app_start Round 0 00:07:35.908 13:19:54 -- event/event.sh@25 -- # waitforlisten 3151355 /var/tmp/spdk-nbd.sock 00:07:35.908 13:19:54 -- common/autotest_common.sh@819 -- # '[' -z 3151355 ']' 00:07:35.908 13:19:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:35.908 13:19:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:36.166 13:19:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:36.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:36.166 13:19:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:36.166 13:19:54 -- common/autotest_common.sh@10 -- # set +x 00:07:36.166 [2024-07-24 13:19:54.789204] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:36.166 [2024-07-24 13:19:54.789311] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3151355 ] 00:07:36.166 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.166 [2024-07-24 13:19:54.896253] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:36.166 [2024-07-24 13:19:54.945226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.166 [2024-07-24 13:19:54.945227] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:37.102 13:19:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:37.102 13:19:55 -- common/autotest_common.sh@852 -- # return 0 00:07:37.102 13:19:55 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:37.102 Malloc0 00:07:37.103 13:19:55 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:37.361 Malloc1 00:07:37.361 13:19:56 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 
'Malloc1') 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@12 -- # local i 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:37.361 /dev/nbd0 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:37.361 13:19:56 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:37.361 13:19:56 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:37.361 13:19:56 -- common/autotest_common.sh@857 -- # local i 00:07:37.361 13:19:56 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:37.361 13:19:56 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:37.361 13:19:56 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:37.361 13:19:56 -- common/autotest_common.sh@861 -- # break 00:07:37.361 13:19:56 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:37.361 13:19:56 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:37.361 13:19:56 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:37.361 1+0 records in 00:07:37.361 1+0 records out 00:07:37.361 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000166696 s, 24.6 MB/s 00:07:37.620 13:19:56 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:37.620 13:19:56 -- common/autotest_common.sh@874 -- # size=4096 00:07:37.620 13:19:56 -- common/autotest_common.sh@875 -- # rm -f 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:37.620 13:19:56 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:37.620 13:19:56 -- common/autotest_common.sh@877 -- # return 0 00:07:37.620 13:19:56 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:37.620 13:19:56 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:37.620 13:19:56 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:37.620 /dev/nbd1 00:07:37.620 13:19:56 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:37.620 13:19:56 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:37.879 13:19:56 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:37.879 13:19:56 -- common/autotest_common.sh@857 -- # local i 00:07:37.879 13:19:56 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:37.879 13:19:56 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:37.879 13:19:56 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:37.879 13:19:56 -- common/autotest_common.sh@861 -- # break 00:07:37.879 13:19:56 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:37.879 13:19:56 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:37.879 13:19:56 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:37.879 1+0 records in 00:07:37.879 1+0 records out 00:07:37.879 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265843 s, 15.4 MB/s 00:07:37.879 13:19:56 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:37.879 13:19:56 -- common/autotest_common.sh@874 -- # size=4096 00:07:37.879 13:19:56 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:37.879 13:19:56 -- common/autotest_common.sh@876 
-- # '[' 4096 '!=' 0 ']' 00:07:37.879 13:19:56 -- common/autotest_common.sh@877 -- # return 0 00:07:37.879 13:19:56 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:37.879 13:19:56 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:37.879 13:19:56 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:37.879 13:19:56 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.879 13:19:56 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:38.138 { 00:07:38.138 "nbd_device": "/dev/nbd0", 00:07:38.138 "bdev_name": "Malloc0" 00:07:38.138 }, 00:07:38.138 { 00:07:38.138 "nbd_device": "/dev/nbd1", 00:07:38.138 "bdev_name": "Malloc1" 00:07:38.138 } 00:07:38.138 ]' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:38.138 { 00:07:38.138 "nbd_device": "/dev/nbd0", 00:07:38.138 "bdev_name": "Malloc0" 00:07:38.138 }, 00:07:38.138 { 00:07:38.138 "nbd_device": "/dev/nbd1", 00:07:38.138 "bdev_name": "Malloc1" 00:07:38.138 } 00:07:38.138 ]' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:38.138 /dev/nbd1' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:38.138 /dev/nbd1' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@65 -- # count=2 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@95 -- # count=2 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@70 -- # local nbd_list 
00:07:38.138 13:19:56 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:38.138 256+0 records in 00:07:38.138 256+0 records out 00:07:38.138 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114574 s, 91.5 MB/s 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:38.138 256+0 records in 00:07:38.138 256+0 records out 00:07:38.138 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0263732 s, 39.8 MB/s 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:38.138 256+0 records in 00:07:38.138 256+0 records out 00:07:38.138 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0225302 s, 46.5 MB/s 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@80 -- # '[' verify = verify 
']' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@51 -- # local i 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.138 13:19:56 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:38.396 13:19:57 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:38.396 13:19:57 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:38.396 13:19:57 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@41 -- # break 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:38.397 13:19:57 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@41 -- # break 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@45 -- # return 0 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:38.662 13:19:57 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@65 -- # true 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@65 -- # count=0 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@104 -- # count=0 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:38.929 13:19:57 -- bdev/nbd_common.sh@109 -- # return 0 00:07:38.929 13:19:57 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:39.189 13:19:57 -- event/event.sh@35 -- 
# sleep 3 00:07:39.189 [2024-07-24 13:19:58.051585] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:39.448 [2024-07-24 13:19:58.098422] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:39.448 [2024-07-24 13:19:58.098427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.448 [2024-07-24 13:19:58.148639] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:39.448 [2024-07-24 13:19:58.148707] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:41.984 13:20:00 -- event/event.sh@23 -- # for i in {0..2} 00:07:41.985 13:20:00 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:41.985 spdk_app_start Round 1 00:07:41.985 13:20:00 -- event/event.sh@25 -- # waitforlisten 3151355 /var/tmp/spdk-nbd.sock 00:07:41.985 13:20:00 -- common/autotest_common.sh@819 -- # '[' -z 3151355 ']' 00:07:41.985 13:20:00 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:41.985 13:20:00 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:41.985 13:20:00 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:41.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:41.985 13:20:00 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:41.985 13:20:00 -- common/autotest_common.sh@10 -- # set +x 00:07:42.244 13:20:01 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:42.244 13:20:01 -- common/autotest_common.sh@852 -- # return 0 00:07:42.244 13:20:01 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:42.504 Malloc0 00:07:42.504 13:20:01 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:42.504 Malloc1 00:07:42.504 13:20:01 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@12 -- # local i 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:42.504 13:20:01 -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:42.763 /dev/nbd0 00:07:42.763 13:20:01 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:42.763 13:20:01 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:42.763 13:20:01 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:42.763 13:20:01 -- common/autotest_common.sh@857 -- # local i 00:07:42.763 13:20:01 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:42.763 13:20:01 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:42.763 13:20:01 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:42.763 13:20:01 -- common/autotest_common.sh@861 -- # break 00:07:42.763 13:20:01 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:42.764 13:20:01 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:42.764 13:20:01 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:42.764 1+0 records in 00:07:42.764 1+0 records out 00:07:42.764 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251231 s, 16.3 MB/s 00:07:42.764 13:20:01 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:42.764 13:20:01 -- common/autotest_common.sh@874 -- # size=4096 00:07:42.764 13:20:01 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:42.764 13:20:01 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:42.764 13:20:01 -- common/autotest_common.sh@877 -- # return 0 00:07:42.764 13:20:01 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:42.764 13:20:01 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:42.764 13:20:01 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 
00:07:43.023 /dev/nbd1 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:43.023 13:20:01 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:43.023 13:20:01 -- common/autotest_common.sh@857 -- # local i 00:07:43.023 13:20:01 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:43.023 13:20:01 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:43.023 13:20:01 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:43.023 13:20:01 -- common/autotest_common.sh@861 -- # break 00:07:43.023 13:20:01 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:43.023 13:20:01 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:43.023 13:20:01 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:43.023 1+0 records in 00:07:43.023 1+0 records out 00:07:43.023 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250499 s, 16.4 MB/s 00:07:43.023 13:20:01 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:43.023 13:20:01 -- common/autotest_common.sh@874 -- # size=4096 00:07:43.023 13:20:01 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:43.023 13:20:01 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:43.023 13:20:01 -- common/autotest_common.sh@877 -- # return 0 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:43.023 13:20:01 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:07:43.282 13:20:01 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:43.282 { 00:07:43.282 "nbd_device": "/dev/nbd0", 00:07:43.282 "bdev_name": "Malloc0" 00:07:43.282 }, 00:07:43.282 { 00:07:43.282 "nbd_device": "/dev/nbd1", 00:07:43.282 "bdev_name": "Malloc1" 00:07:43.282 } 00:07:43.282 ]' 00:07:43.282 13:20:01 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:43.282 { 00:07:43.282 "nbd_device": "/dev/nbd0", 00:07:43.282 "bdev_name": "Malloc0" 00:07:43.282 }, 00:07:43.282 { 00:07:43.282 "nbd_device": "/dev/nbd1", 00:07:43.282 "bdev_name": "Malloc1" 00:07:43.282 } 00:07:43.282 ]' 00:07:43.282 13:20:01 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:43.282 /dev/nbd1' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:43.282 /dev/nbd1' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@65 -- # count=2 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@95 -- # count=2 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:43.282 256+0 records in 00:07:43.282 256+0 records out 00:07:43.282 
1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105448 s, 99.4 MB/s 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:43.282 256+0 records in 00:07:43.282 256+0 records out 00:07:43.282 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0307298 s, 34.1 MB/s 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:43.282 256+0 records in 00:07:43.282 256+0 records out 00:07:43.282 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0316538 s, 33.1 MB/s 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:43.282 13:20:02 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@85 -- # rm 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@51 -- # local i 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:43.542 13:20:02 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@41 -- # break 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:43.801 13:20:02 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:44.059 13:20:02 -- 
bdev/nbd_common.sh@41 -- # break 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@45 -- # return 0 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.059 13:20:02 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@65 -- # true 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@65 -- # count=0 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@104 -- # count=0 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:44.318 13:20:02 -- bdev/nbd_common.sh@109 -- # return 0 00:07:44.318 13:20:02 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:44.318 13:20:03 -- event/event.sh@35 -- # sleep 3 00:07:44.577 [2024-07-24 13:20:03.387131] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:44.577 [2024-07-24 13:20:03.432253] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:44.577 [2024-07-24 13:20:03.432257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.835 [2024-07-24 13:20:03.479006] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 
00:07:44.835 [2024-07-24 13:20:03.479061] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:47.365 13:20:06 -- event/event.sh@23 -- # for i in {0..2} 00:07:47.365 13:20:06 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:47.365 spdk_app_start Round 2 00:07:47.365 13:20:06 -- event/event.sh@25 -- # waitforlisten 3151355 /var/tmp/spdk-nbd.sock 00:07:47.366 13:20:06 -- common/autotest_common.sh@819 -- # '[' -z 3151355 ']' 00:07:47.366 13:20:06 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:47.366 13:20:06 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:47.366 13:20:06 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:47.366 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:47.366 13:20:06 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:47.366 13:20:06 -- common/autotest_common.sh@10 -- # set +x 00:07:47.624 13:20:06 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:47.624 13:20:06 -- common/autotest_common.sh@852 -- # return 0 00:07:47.624 13:20:06 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:47.883 Malloc0 00:07:47.883 13:20:06 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:48.142 Malloc1 00:07:48.142 13:20:06 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:48.142 13:20:06 
-- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@12 -- # local i 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:48.142 13:20:06 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:48.401 /dev/nbd0 00:07:48.401 13:20:07 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:48.401 13:20:07 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:48.401 13:20:07 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:48.401 13:20:07 -- common/autotest_common.sh@857 -- # local i 00:07:48.401 13:20:07 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:48.401 13:20:07 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:48.401 13:20:07 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:48.401 13:20:07 -- common/autotest_common.sh@861 -- # break 00:07:48.401 13:20:07 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:48.401 13:20:07 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:48.401 13:20:07 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:48.401 1+0 records in 
00:07:48.401 1+0 records out 00:07:48.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306089 s, 13.4 MB/s 00:07:48.401 13:20:07 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:48.401 13:20:07 -- common/autotest_common.sh@874 -- # size=4096 00:07:48.401 13:20:07 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:48.401 13:20:07 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:48.401 13:20:07 -- common/autotest_common.sh@877 -- # return 0 00:07:48.401 13:20:07 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:48.401 13:20:07 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:48.401 13:20:07 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:48.660 /dev/nbd1 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:48.660 13:20:07 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:48.660 13:20:07 -- common/autotest_common.sh@857 -- # local i 00:07:48.660 13:20:07 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:48.660 13:20:07 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:48.660 13:20:07 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:48.660 13:20:07 -- common/autotest_common.sh@861 -- # break 00:07:48.660 13:20:07 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:48.660 13:20:07 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:48.660 13:20:07 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:48.660 1+0 records in 00:07:48.660 1+0 records out 00:07:48.660 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270056 s, 15.2 MB/s 00:07:48.660 13:20:07 
-- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:48.660 13:20:07 -- common/autotest_common.sh@874 -- # size=4096 00:07:48.660 13:20:07 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:48.660 13:20:07 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:48.660 13:20:07 -- common/autotest_common.sh@877 -- # return 0 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.660 13:20:07 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:48.919 { 00:07:48.919 "nbd_device": "/dev/nbd0", 00:07:48.919 "bdev_name": "Malloc0" 00:07:48.919 }, 00:07:48.919 { 00:07:48.919 "nbd_device": "/dev/nbd1", 00:07:48.919 "bdev_name": "Malloc1" 00:07:48.919 } 00:07:48.919 ]' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:48.919 { 00:07:48.919 "nbd_device": "/dev/nbd0", 00:07:48.919 "bdev_name": "Malloc0" 00:07:48.919 }, 00:07:48.919 { 00:07:48.919 "nbd_device": "/dev/nbd1", 00:07:48.919 "bdev_name": "Malloc1" 00:07:48.919 } 00:07:48.919 ]' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:48.919 /dev/nbd1' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:48.919 /dev/nbd1' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@65 -- # count=2 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@66 -- # echo 2 
00:07:48.919 13:20:07 -- bdev/nbd_common.sh@95 -- # count=2 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:48.919 256+0 records in 00:07:48.919 256+0 records out 00:07:48.919 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011426 s, 91.8 MB/s 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:48.919 256+0 records in 00:07:48.919 256+0 records out 00:07:48.919 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0307462 s, 34.1 MB/s 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:48.919 256+0 records in 00:07:48.919 256+0 records out 00:07:48.919 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0320359 s, 32.7 MB/s 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@70 -- # 
local nbd_list 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@51 -- # local i 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.919 13:20:07 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:49.178 13:20:07 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:49.178 
13:20:08 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@41 -- # break 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@45 -- # return 0 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:49.178 13:20:08 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@41 -- # break 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@45 -- # return 0 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:49.436 13:20:08 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:49.696 13:20:08 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:49.696 13:20:08 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:49.696 13:20:08 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:49.696 13:20:08 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:49.696 13:20:08 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:49.696 13:20:08 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:49.955 13:20:08 -- bdev/nbd_common.sh@65 -- # true 00:07:49.955 13:20:08 -- bdev/nbd_common.sh@65 -- # count=0 00:07:49.955 13:20:08 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:49.955 13:20:08 -- bdev/nbd_common.sh@104 -- # count=0 00:07:49.955 13:20:08 -- 
bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:49.955 13:20:08 -- bdev/nbd_common.sh@109 -- # return 0 00:07:49.955 13:20:08 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:49.955 13:20:08 -- event/event.sh@35 -- # sleep 3 00:07:50.214 [2024-07-24 13:20:08.974891] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:50.214 [2024-07-24 13:20:09.023480] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:50.214 [2024-07-24 13:20:09.023483] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.214 [2024-07-24 13:20:09.073760] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:50.214 [2024-07-24 13:20:09.073833] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:53.501 13:20:11 -- event/event.sh@38 -- # waitforlisten 3151355 /var/tmp/spdk-nbd.sock 00:07:53.501 13:20:11 -- common/autotest_common.sh@819 -- # '[' -z 3151355 ']' 00:07:53.501 13:20:11 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:53.501 13:20:11 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:53.501 13:20:11 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:53.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:53.501 13:20:11 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:53.501 13:20:11 -- common/autotest_common.sh@10 -- # set +x 00:07:53.501 13:20:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:53.501 13:20:11 -- common/autotest_common.sh@852 -- # return 0 00:07:53.501 13:20:11 -- event/event.sh@39 -- # killprocess 3151355 00:07:53.501 13:20:11 -- common/autotest_common.sh@926 -- # '[' -z 3151355 ']' 00:07:53.501 13:20:11 -- common/autotest_common.sh@930 -- # kill -0 3151355 00:07:53.501 13:20:11 -- common/autotest_common.sh@931 -- # uname 00:07:53.501 13:20:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:53.501 13:20:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3151355 00:07:53.501 13:20:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:53.501 13:20:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:53.501 13:20:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3151355' 00:07:53.501 killing process with pid 3151355 00:07:53.501 13:20:12 -- common/autotest_common.sh@945 -- # kill 3151355 00:07:53.501 13:20:12 -- common/autotest_common.sh@950 -- # wait 3151355 00:07:53.501 spdk_app_start is called in Round 0. 00:07:53.501 Shutdown signal received, stop current app iteration 00:07:53.501 Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 reinitialization... 00:07:53.501 spdk_app_start is called in Round 1. 00:07:53.501 Shutdown signal received, stop current app iteration 00:07:53.501 Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 reinitialization... 00:07:53.501 spdk_app_start is called in Round 2. 00:07:53.501 Shutdown signal received, stop current app iteration 00:07:53.501 Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 reinitialization... 00:07:53.501 spdk_app_start is called in Round 3. 
00:07:53.501 Shutdown signal received, stop current app iteration 00:07:53.501 13:20:12 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:53.501 13:20:12 -- event/event.sh@42 -- # return 0 00:07:53.501 00:07:53.501 real 0m17.445s 00:07:53.501 user 0m37.122s 00:07:53.501 sys 0m3.848s 00:07:53.501 13:20:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:53.501 13:20:12 -- common/autotest_common.sh@10 -- # set +x 00:07:53.501 ************************************ 00:07:53.501 END TEST app_repeat 00:07:53.501 ************************************ 00:07:53.501 13:20:12 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:53.501 13:20:12 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:53.501 13:20:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:53.501 13:20:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:53.501 13:20:12 -- common/autotest_common.sh@10 -- # set +x 00:07:53.501 ************************************ 00:07:53.501 START TEST cpu_locks 00:07:53.501 ************************************ 00:07:53.501 13:20:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:53.501 * Looking for test storage... 
00:07:53.501 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:53.501 13:20:12 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:53.501 13:20:12 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:53.501 13:20:12 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:53.501 13:20:12 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:53.501 13:20:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:53.501 13:20:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:53.501 13:20:12 -- common/autotest_common.sh@10 -- # set +x 00:07:53.501 ************************************ 00:07:53.501 START TEST default_locks 00:07:53.501 ************************************ 00:07:53.501 13:20:12 -- common/autotest_common.sh@1104 -- # default_locks 00:07:53.501 13:20:12 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3153907 00:07:53.501 13:20:12 -- event/cpu_locks.sh@47 -- # waitforlisten 3153907 00:07:53.501 13:20:12 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:53.501 13:20:12 -- common/autotest_common.sh@819 -- # '[' -z 3153907 ']' 00:07:53.501 13:20:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:53.501 13:20:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:53.501 13:20:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:53.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:53.501 13:20:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:53.501 13:20:12 -- common/autotest_common.sh@10 -- # set +x 00:07:53.760 [2024-07-24 13:20:12.387065] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:53.760 [2024-07-24 13:20:12.387145] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3153907 ] 00:07:53.760 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.760 [2024-07-24 13:20:12.507110] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.760 [2024-07-24 13:20:12.552154] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:53.760 [2024-07-24 13:20:12.552335] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.695 13:20:13 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:54.695 13:20:13 -- common/autotest_common.sh@852 -- # return 0 00:07:54.695 13:20:13 -- event/cpu_locks.sh@49 -- # locks_exist 3153907 00:07:54.695 13:20:13 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:54.695 13:20:13 -- event/cpu_locks.sh@22 -- # lslocks -p 3153907 00:07:55.263 lslocks: write error 00:07:55.263 13:20:13 -- event/cpu_locks.sh@50 -- # killprocess 3153907 00:07:55.263 13:20:13 -- common/autotest_common.sh@926 -- # '[' -z 3153907 ']' 00:07:55.263 13:20:13 -- common/autotest_common.sh@930 -- # kill -0 3153907 00:07:55.263 13:20:13 -- common/autotest_common.sh@931 -- # uname 00:07:55.263 13:20:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:55.263 13:20:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3153907 00:07:55.263 13:20:13 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:55.263 13:20:13 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:55.263 13:20:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3153907' 00:07:55.263 killing process with pid 3153907 00:07:55.263 13:20:13 -- common/autotest_common.sh@945 -- # kill 3153907 00:07:55.263 13:20:13 -- common/autotest_common.sh@950 -- # 
wait 3153907 00:07:55.522 13:20:14 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3153907 00:07:55.522 13:20:14 -- common/autotest_common.sh@640 -- # local es=0 00:07:55.522 13:20:14 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3153907 00:07:55.522 13:20:14 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:07:55.522 13:20:14 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:55.522 13:20:14 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:07:55.522 13:20:14 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:55.522 13:20:14 -- common/autotest_common.sh@643 -- # waitforlisten 3153907 00:07:55.522 13:20:14 -- common/autotest_common.sh@819 -- # '[' -z 3153907 ']' 00:07:55.522 13:20:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:55.522 13:20:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:55.522 13:20:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:55.522 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:55.522 13:20:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:55.522 13:20:14 -- common/autotest_common.sh@10 -- # set +x 00:07:55.522 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3153907) - No such process 00:07:55.522 ERROR: process (pid: 3153907) is no longer running 00:07:55.522 13:20:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:55.522 13:20:14 -- common/autotest_common.sh@852 -- # return 1 00:07:55.522 13:20:14 -- common/autotest_common.sh@643 -- # es=1 00:07:55.522 13:20:14 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:55.522 13:20:14 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:55.522 13:20:14 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:55.522 13:20:14 -- event/cpu_locks.sh@54 -- # no_locks 00:07:55.522 13:20:14 -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:55.522 13:20:14 -- event/cpu_locks.sh@26 -- # local lock_files 00:07:55.522 13:20:14 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:55.522 00:07:55.522 real 0m1.910s 00:07:55.522 user 0m1.993s 00:07:55.522 sys 0m0.736s 00:07:55.522 13:20:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.522 13:20:14 -- common/autotest_common.sh@10 -- # set +x 00:07:55.522 ************************************ 00:07:55.522 END TEST default_locks 00:07:55.522 ************************************ 00:07:55.523 13:20:14 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:55.523 13:20:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:55.523 13:20:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:55.523 13:20:14 -- common/autotest_common.sh@10 -- # set +x 00:07:55.523 ************************************ 00:07:55.523 START TEST default_locks_via_rpc 00:07:55.523 ************************************ 00:07:55.523 13:20:14 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:07:55.523 13:20:14 -- 
event/cpu_locks.sh@62 -- # spdk_tgt_pid=3154220 00:07:55.523 13:20:14 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:55.523 13:20:14 -- event/cpu_locks.sh@63 -- # waitforlisten 3154220 00:07:55.523 13:20:14 -- common/autotest_common.sh@819 -- # '[' -z 3154220 ']' 00:07:55.523 13:20:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:55.523 13:20:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:55.523 13:20:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:55.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:55.523 13:20:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:55.523 13:20:14 -- common/autotest_common.sh@10 -- # set +x 00:07:55.523 [2024-07-24 13:20:14.330603] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:07:55.523 [2024-07-24 13:20:14.330664] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3154220 ] 00:07:55.523 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.786 [2024-07-24 13:20:14.432998] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.786 [2024-07-24 13:20:14.482806] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.786 [2024-07-24 13:20:14.482969] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.418 13:20:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:56.418 13:20:15 -- common/autotest_common.sh@852 -- # return 0 00:07:56.418 13:20:15 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:56.418 13:20:15 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:56.418 13:20:15 -- common/autotest_common.sh@10 -- # set +x 00:07:56.418 13:20:15 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:56.418 13:20:15 -- event/cpu_locks.sh@67 -- # no_locks 00:07:56.418 13:20:15 -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:56.418 13:20:15 -- event/cpu_locks.sh@26 -- # local lock_files 00:07:56.418 13:20:15 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:56.418 13:20:15 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:56.418 13:20:15 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:56.418 13:20:15 -- common/autotest_common.sh@10 -- # set +x 00:07:56.418 13:20:15 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:56.418 13:20:15 -- event/cpu_locks.sh@71 -- # locks_exist 3154220 00:07:56.418 13:20:15 -- event/cpu_locks.sh@22 -- # lslocks -p 3154220 00:07:56.418 13:20:15 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:56.677 13:20:15 -- event/cpu_locks.sh@73 -- # killprocess 3154220 
00:07:56.677 13:20:15 -- common/autotest_common.sh@926 -- # '[' -z 3154220 ']' 00:07:56.677 13:20:15 -- common/autotest_common.sh@930 -- # kill -0 3154220 00:07:56.677 13:20:15 -- common/autotest_common.sh@931 -- # uname 00:07:56.677 13:20:15 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:56.677 13:20:15 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3154220 00:07:56.677 13:20:15 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:56.677 13:20:15 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:56.677 13:20:15 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3154220' 00:07:56.677 killing process with pid 3154220 00:07:56.677 13:20:15 -- common/autotest_common.sh@945 -- # kill 3154220 00:07:56.677 13:20:15 -- common/autotest_common.sh@950 -- # wait 3154220 00:07:57.244 00:07:57.244 real 0m1.557s 00:07:57.244 user 0m1.620s 00:07:57.244 sys 0m0.530s 00:07:57.244 13:20:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.244 13:20:15 -- common/autotest_common.sh@10 -- # set +x 00:07:57.244 ************************************ 00:07:57.244 END TEST default_locks_via_rpc 00:07:57.244 ************************************ 00:07:57.244 13:20:15 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:57.244 13:20:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:57.244 13:20:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:57.244 13:20:15 -- common/autotest_common.sh@10 -- # set +x 00:07:57.244 ************************************ 00:07:57.244 START TEST non_locking_app_on_locked_coremask 00:07:57.244 ************************************ 00:07:57.244 13:20:15 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:07:57.244 13:20:15 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3154496 00:07:57.244 13:20:15 -- event/cpu_locks.sh@81 -- # waitforlisten 3154496 
/var/tmp/spdk.sock 00:07:57.244 13:20:15 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:57.244 13:20:15 -- common/autotest_common.sh@819 -- # '[' -z 3154496 ']' 00:07:57.244 13:20:15 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:57.244 13:20:15 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:57.244 13:20:15 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:57.244 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:57.244 13:20:15 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:57.244 13:20:15 -- common/autotest_common.sh@10 -- # set +x 00:07:57.244 [2024-07-24 13:20:15.950660] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:57.244 [2024-07-24 13:20:15.950738] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3154496 ] 00:07:57.244 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.244 [2024-07-24 13:20:16.067297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.503 [2024-07-24 13:20:16.112860] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.503 [2024-07-24 13:20:16.113012] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.070 13:20:16 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:58.070 13:20:16 -- common/autotest_common.sh@852 -- # return 0 00:07:58.070 13:20:16 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3154547 00:07:58.070 13:20:16 -- event/cpu_locks.sh@85 -- # waitforlisten 3154547 /var/tmp/spdk2.sock 00:07:58.070 13:20:16 -- event/cpu_locks.sh@83 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:58.070 13:20:16 -- common/autotest_common.sh@819 -- # '[' -z 3154547 ']' 00:07:58.070 13:20:16 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:58.070 13:20:16 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:58.070 13:20:16 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:58.070 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:58.070 13:20:16 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:58.070 13:20:16 -- common/autotest_common.sh@10 -- # set +x 00:07:58.070 [2024-07-24 13:20:16.929784] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:07:58.070 [2024-07-24 13:20:16.929859] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3154547 ] 00:07:58.329 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.329 [2024-07-24 13:20:17.091741] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:58.329 [2024-07-24 13:20:17.091783] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.329 [2024-07-24 13:20:17.191436] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.329 [2024-07-24 13:20:17.191590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.265 13:20:17 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:59.265 13:20:17 -- common/autotest_common.sh@852 -- # return 0 00:07:59.265 13:20:17 -- event/cpu_locks.sh@87 -- # locks_exist 3154496 00:07:59.265 13:20:17 -- event/cpu_locks.sh@22 -- # lslocks -p 3154496 00:07:59.265 13:20:17 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:08:00.201 lslocks: write error 00:08:00.201 13:20:18 -- event/cpu_locks.sh@89 -- # killprocess 3154496 00:08:00.201 13:20:18 -- common/autotest_common.sh@926 -- # '[' -z 3154496 ']' 00:08:00.201 13:20:18 -- common/autotest_common.sh@930 -- # kill -0 3154496 00:08:00.201 13:20:18 -- common/autotest_common.sh@931 -- # uname 00:08:00.201 13:20:18 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:00.201 13:20:18 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3154496 00:08:00.201 13:20:19 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:00.201 13:20:19 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:00.201 13:20:19 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3154496' 00:08:00.201 killing process with pid 3154496 00:08:00.201 13:20:19 -- common/autotest_common.sh@945 -- # kill 3154496 00:08:00.201 13:20:19 -- common/autotest_common.sh@950 -- # wait 3154496 00:08:01.140 13:20:19 -- event/cpu_locks.sh@90 -- # killprocess 3154547 00:08:01.140 13:20:19 -- common/autotest_common.sh@926 -- # '[' -z 3154547 ']' 00:08:01.140 13:20:19 -- common/autotest_common.sh@930 -- # kill -0 3154547 00:08:01.140 13:20:19 -- common/autotest_common.sh@931 -- # uname 00:08:01.140 13:20:19 -- 
common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:01.140 13:20:19 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3154547 00:08:01.140 13:20:19 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:01.140 13:20:19 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:01.140 13:20:19 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3154547' 00:08:01.140 killing process with pid 3154547 00:08:01.140 13:20:19 -- common/autotest_common.sh@945 -- # kill 3154547 00:08:01.140 13:20:19 -- common/autotest_common.sh@950 -- # wait 3154547 00:08:01.400 00:08:01.400 real 0m4.168s 00:08:01.400 user 0m4.550s 00:08:01.400 sys 0m1.456s 00:08:01.400 13:20:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.400 13:20:20 -- common/autotest_common.sh@10 -- # set +x 00:08:01.400 ************************************ 00:08:01.400 END TEST non_locking_app_on_locked_coremask 00:08:01.400 ************************************ 00:08:01.400 13:20:20 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:08:01.400 13:20:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:01.400 13:20:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:01.400 13:20:20 -- common/autotest_common.sh@10 -- # set +x 00:08:01.400 ************************************ 00:08:01.400 START TEST locking_app_on_unlocked_coremask 00:08:01.400 ************************************ 00:08:01.400 13:20:20 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:08:01.400 13:20:20 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3155072 00:08:01.400 13:20:20 -- event/cpu_locks.sh@99 -- # waitforlisten 3155072 /var/tmp/spdk.sock 00:08:01.400 13:20:20 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:08:01.400 13:20:20 -- common/autotest_common.sh@819 -- # '[' -z 3155072 ']' 
00:08:01.400 13:20:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:01.400 13:20:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:01.400 13:20:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:01.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:01.400 13:20:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:01.400 13:20:20 -- common/autotest_common.sh@10 -- # set +x 00:08:01.400 [2024-07-24 13:20:20.172343] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:01.400 [2024-07-24 13:20:20.172445] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3155072 ] 00:08:01.400 EAL: No free 2048 kB hugepages reported on node 1 00:08:01.659 [2024-07-24 13:20:20.294741] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:08:01.659 [2024-07-24 13:20:20.294779] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.660 [2024-07-24 13:20:20.343920] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:01.660 [2024-07-24 13:20:20.344082] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.227 13:20:21 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:02.227 13:20:21 -- common/autotest_common.sh@852 -- # return 0 00:08:02.227 13:20:21 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3155171 00:08:02.227 13:20:21 -- event/cpu_locks.sh@103 -- # waitforlisten 3155171 /var/tmp/spdk2.sock 00:08:02.227 13:20:21 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:08:02.227 13:20:21 -- common/autotest_common.sh@819 -- # '[' -z 3155171 ']' 00:08:02.227 13:20:21 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:02.227 13:20:21 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:02.227 13:20:21 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:02.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:02.227 13:20:21 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:02.227 13:20:21 -- common/autotest_common.sh@10 -- # set +x 00:08:02.227 [2024-07-24 13:20:21.083087] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:02.227 [2024-07-24 13:20:21.083168] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3155171 ] 00:08:02.487 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.487 [2024-07-24 13:20:21.226866] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.487 [2024-07-24 13:20:21.315508] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.487 [2024-07-24 13:20:21.315671] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.423 13:20:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:03.423 13:20:22 -- common/autotest_common.sh@852 -- # return 0 00:08:03.423 13:20:22 -- event/cpu_locks.sh@105 -- # locks_exist 3155171 00:08:03.423 13:20:22 -- event/cpu_locks.sh@22 -- # lslocks -p 3155171 00:08:03.423 13:20:22 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:08:04.800 lslocks: write error 00:08:04.800 13:20:23 -- event/cpu_locks.sh@107 -- # killprocess 3155072 00:08:04.800 13:20:23 -- common/autotest_common.sh@926 -- # '[' -z 3155072 ']' 00:08:04.800 13:20:23 -- common/autotest_common.sh@930 -- # kill -0 3155072 00:08:04.800 13:20:23 -- common/autotest_common.sh@931 -- # uname 00:08:04.800 13:20:23 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:04.800 13:20:23 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3155072 00:08:04.801 13:20:23 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:04.801 13:20:23 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:04.801 13:20:23 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3155072' 00:08:04.801 killing process with pid 3155072 00:08:04.801 13:20:23 -- common/autotest_common.sh@945 -- # kill 3155072 00:08:04.801 13:20:23 -- common/autotest_common.sh@950 -- # 
wait 3155072 00:08:05.368 13:20:24 -- event/cpu_locks.sh@108 -- # killprocess 3155171 00:08:05.368 13:20:24 -- common/autotest_common.sh@926 -- # '[' -z 3155171 ']' 00:08:05.368 13:20:24 -- common/autotest_common.sh@930 -- # kill -0 3155171 00:08:05.368 13:20:24 -- common/autotest_common.sh@931 -- # uname 00:08:05.368 13:20:24 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:05.368 13:20:24 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3155171 00:08:05.368 13:20:24 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:05.368 13:20:24 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:05.368 13:20:24 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3155171' 00:08:05.368 killing process with pid 3155171 00:08:05.368 13:20:24 -- common/autotest_common.sh@945 -- # kill 3155171 00:08:05.368 13:20:24 -- common/autotest_common.sh@950 -- # wait 3155171 00:08:05.937 00:08:05.937 real 0m4.426s 00:08:05.937 user 0m4.735s 00:08:05.937 sys 0m1.550s 00:08:05.937 13:20:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.937 13:20:24 -- common/autotest_common.sh@10 -- # set +x 00:08:05.937 ************************************ 00:08:05.937 END TEST locking_app_on_unlocked_coremask 00:08:05.937 ************************************ 00:08:05.937 13:20:24 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:08:05.937 13:20:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:05.937 13:20:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:05.937 13:20:24 -- common/autotest_common.sh@10 -- # set +x 00:08:05.937 ************************************ 00:08:05.937 START TEST locking_app_on_locked_coremask 00:08:05.937 ************************************ 00:08:05.937 13:20:24 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:08:05.937 13:20:24 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3155655 
00:08:05.937 13:20:24 -- event/cpu_locks.sh@116 -- # waitforlisten 3155655 /var/tmp/spdk.sock 00:08:05.937 13:20:24 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:08:05.937 13:20:24 -- common/autotest_common.sh@819 -- # '[' -z 3155655 ']' 00:08:05.937 13:20:24 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:05.937 13:20:24 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:05.937 13:20:24 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:05.937 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:05.937 13:20:24 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:05.937 13:20:24 -- common/autotest_common.sh@10 -- # set +x 00:08:05.937 [2024-07-24 13:20:24.649649] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:05.937 [2024-07-24 13:20:24.649726] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3155655 ] 00:08:05.937 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.937 [2024-07-24 13:20:24.755409] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.196 [2024-07-24 13:20:24.803129] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:06.196 [2024-07-24 13:20:24.803295] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.763 13:20:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:06.764 13:20:25 -- common/autotest_common.sh@852 -- # return 0 00:08:06.764 13:20:25 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3155835 00:08:06.764 13:20:25 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3155835 
/var/tmp/spdk2.sock 00:08:06.764 13:20:25 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:08:06.764 13:20:25 -- common/autotest_common.sh@640 -- # local es=0 00:08:06.764 13:20:25 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3155835 /var/tmp/spdk2.sock 00:08:06.764 13:20:25 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:08:06.764 13:20:25 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:06.764 13:20:25 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:08:06.764 13:20:25 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:06.764 13:20:25 -- common/autotest_common.sh@643 -- # waitforlisten 3155835 /var/tmp/spdk2.sock 00:08:06.764 13:20:25 -- common/autotest_common.sh@819 -- # '[' -z 3155835 ']' 00:08:06.764 13:20:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:06.764 13:20:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:06.764 13:20:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:06.764 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:06.764 13:20:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:06.764 13:20:25 -- common/autotest_common.sh@10 -- # set +x 00:08:06.764 [2024-07-24 13:20:25.567851] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:06.764 [2024-07-24 13:20:25.567919] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3155835 ] 00:08:06.764 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.022 [2024-07-24 13:20:25.726047] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3155655 has claimed it. 00:08:07.022 [2024-07-24 13:20:25.726101] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:08:07.590 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3155835) - No such process 00:08:07.590 ERROR: process (pid: 3155835) is no longer running 00:08:07.590 13:20:26 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:07.590 13:20:26 -- common/autotest_common.sh@852 -- # return 1 00:08:07.590 13:20:26 -- common/autotest_common.sh@643 -- # es=1 00:08:07.590 13:20:26 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:07.590 13:20:26 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:07.590 13:20:26 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:07.590 13:20:26 -- event/cpu_locks.sh@122 -- # locks_exist 3155655 00:08:07.590 13:20:26 -- event/cpu_locks.sh@22 -- # lslocks -p 3155655 00:08:07.590 13:20:26 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:08:08.158 lslocks: write error 00:08:08.158 13:20:26 -- event/cpu_locks.sh@124 -- # killprocess 3155655 00:08:08.158 13:20:26 -- common/autotest_common.sh@926 -- # '[' -z 3155655 ']' 00:08:08.158 13:20:26 -- common/autotest_common.sh@930 -- # kill -0 3155655 00:08:08.158 13:20:26 -- common/autotest_common.sh@931 -- # uname 00:08:08.158 13:20:26 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:08.158 13:20:26 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3155655 00:08:08.158 13:20:26 
-- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:08.158 13:20:26 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:08.158 13:20:26 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3155655' 00:08:08.158 killing process with pid 3155655 00:08:08.158 13:20:26 -- common/autotest_common.sh@945 -- # kill 3155655 00:08:08.158 13:20:26 -- common/autotest_common.sh@950 -- # wait 3155655 00:08:08.726 00:08:08.726 real 0m2.691s 00:08:08.726 user 0m2.944s 00:08:08.726 sys 0m0.881s 00:08:08.726 13:20:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.726 13:20:27 -- common/autotest_common.sh@10 -- # set +x 00:08:08.726 ************************************ 00:08:08.726 END TEST locking_app_on_locked_coremask 00:08:08.726 ************************************ 00:08:08.726 13:20:27 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:08:08.726 13:20:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:08.726 13:20:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:08.726 13:20:27 -- common/autotest_common.sh@10 -- # set +x 00:08:08.726 ************************************ 00:08:08.726 START TEST locking_overlapped_coremask 00:08:08.726 ************************************ 00:08:08.726 13:20:27 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:08:08.726 13:20:27 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3156052 00:08:08.726 13:20:27 -- event/cpu_locks.sh@133 -- # waitforlisten 3156052 /var/tmp/spdk.sock 00:08:08.726 13:20:27 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:08:08.726 13:20:27 -- common/autotest_common.sh@819 -- # '[' -z 3156052 ']' 00:08:08.726 13:20:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:08.726 13:20:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:08.726 13:20:27 -- 
common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:08.726 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:08.726 13:20:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:08.726 13:20:27 -- common/autotest_common.sh@10 -- # set +x 00:08:08.726 [2024-07-24 13:20:27.393243] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:08.726 [2024-07-24 13:20:27.393332] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156052 ] 00:08:08.726 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.726 [2024-07-24 13:20:27.514509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:08.726 [2024-07-24 13:20:27.561023] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.726 [2024-07-24 13:20:27.561224] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:08.726 [2024-07-24 13:20:27.561309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:08.726 [2024-07-24 13:20:27.561313] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.664 13:20:28 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:09.664 13:20:28 -- common/autotest_common.sh@852 -- # return 0 00:08:09.664 13:20:28 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3156226 00:08:09.664 13:20:28 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3156226 /var/tmp/spdk2.sock 00:08:09.664 13:20:28 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:08:09.664 13:20:28 -- common/autotest_common.sh@640 -- # local es=0 00:08:09.664 13:20:28 -- common/autotest_common.sh@642 -- # 
valid_exec_arg waitforlisten 3156226 /var/tmp/spdk2.sock 00:08:09.664 13:20:28 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:08:09.664 13:20:28 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:09.664 13:20:28 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:08:09.664 13:20:28 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:09.664 13:20:28 -- common/autotest_common.sh@643 -- # waitforlisten 3156226 /var/tmp/spdk2.sock 00:08:09.664 13:20:28 -- common/autotest_common.sh@819 -- # '[' -z 3156226 ']' 00:08:09.664 13:20:28 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:09.664 13:20:28 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:09.664 13:20:28 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:09.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:09.664 13:20:28 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:09.664 13:20:28 -- common/autotest_common.sh@10 -- # set +x 00:08:09.664 [2024-07-24 13:20:28.313049] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:09.664 [2024-07-24 13:20:28.313128] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156226 ] 00:08:09.664 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.664 [2024-07-24 13:20:28.444209] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3156052 has claimed it. 00:08:09.664 [2024-07-24 13:20:28.444252] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
00:08:10.233 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3156226) - No such process 00:08:10.233 ERROR: process (pid: 3156226) is no longer running 00:08:10.233 13:20:28 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:10.233 13:20:28 -- common/autotest_common.sh@852 -- # return 1 00:08:10.233 13:20:28 -- common/autotest_common.sh@643 -- # es=1 00:08:10.233 13:20:28 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:10.233 13:20:28 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:10.233 13:20:28 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:10.233 13:20:28 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:08:10.233 13:20:28 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:08:10.233 13:20:28 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:08:10.233 13:20:28 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:08:10.233 13:20:28 -- event/cpu_locks.sh@141 -- # killprocess 3156052 00:08:10.233 13:20:28 -- common/autotest_common.sh@926 -- # '[' -z 3156052 ']' 00:08:10.233 13:20:28 -- common/autotest_common.sh@930 -- # kill -0 3156052 00:08:10.233 13:20:28 -- common/autotest_common.sh@931 -- # uname 00:08:10.233 13:20:28 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:10.233 13:20:28 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3156052 00:08:10.233 13:20:29 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:10.233 13:20:29 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:10.233 13:20:29 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3156052' 00:08:10.233 killing process with pid 3156052 00:08:10.233 
13:20:29 -- common/autotest_common.sh@945 -- # kill 3156052 00:08:10.233 13:20:29 -- common/autotest_common.sh@950 -- # wait 3156052 00:08:10.801 00:08:10.801 real 0m2.003s 00:08:10.801 user 0m5.604s 00:08:10.801 sys 0m0.534s 00:08:10.801 13:20:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.801 13:20:29 -- common/autotest_common.sh@10 -- # set +x 00:08:10.801 ************************************ 00:08:10.801 END TEST locking_overlapped_coremask 00:08:10.801 ************************************ 00:08:10.801 13:20:29 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:08:10.801 13:20:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:10.801 13:20:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:10.801 13:20:29 -- common/autotest_common.sh@10 -- # set +x 00:08:10.801 ************************************ 00:08:10.801 START TEST locking_overlapped_coremask_via_rpc 00:08:10.801 ************************************ 00:08:10.801 13:20:29 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:08:10.801 13:20:29 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3156436 00:08:10.801 13:20:29 -- event/cpu_locks.sh@149 -- # waitforlisten 3156436 /var/tmp/spdk.sock 00:08:10.801 13:20:29 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:08:10.801 13:20:29 -- common/autotest_common.sh@819 -- # '[' -z 3156436 ']' 00:08:10.801 13:20:29 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:10.801 13:20:29 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:10.801 13:20:29 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:10.801 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:10.801 13:20:29 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:10.801 13:20:29 -- common/autotest_common.sh@10 -- # set +x 00:08:10.801 [2024-07-24 13:20:29.450020] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:10.801 [2024-07-24 13:20:29.450104] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156436 ] 00:08:10.801 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.801 [2024-07-24 13:20:29.571175] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:08:10.801 [2024-07-24 13:20:29.571220] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:10.801 [2024-07-24 13:20:29.621416] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.801 [2024-07-24 13:20:29.621613] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:10.801 [2024-07-24 13:20:29.621696] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:10.801 [2024-07-24 13:20:29.621700] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.738 13:20:30 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:11.738 13:20:30 -- common/autotest_common.sh@852 -- # return 0 00:08:11.738 13:20:30 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3156451 00:08:11.738 13:20:30 -- event/cpu_locks.sh@153 -- # waitforlisten 3156451 /var/tmp/spdk2.sock 00:08:11.738 13:20:30 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:08:11.738 13:20:30 -- common/autotest_common.sh@819 -- # '[' -z 3156451 ']' 00:08:11.738 13:20:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:11.738 13:20:30 -- common/autotest_common.sh@824 -- # 
local max_retries=100 00:08:11.738 13:20:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:11.738 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:11.738 13:20:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:11.738 13:20:30 -- common/autotest_common.sh@10 -- # set +x 00:08:11.738 [2024-07-24 13:20:30.374011] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:11.738 [2024-07-24 13:20:30.374118] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156451 ] 00:08:11.738 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.738 [2024-07-24 13:20:30.492321] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:08:11.738 [2024-07-24 13:20:30.492352] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:11.738 [2024-07-24 13:20:30.573377] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:11.738 [2024-07-24 13:20:30.573576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:11.738 [2024-07-24 13:20:30.577287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:11.738 [2024-07-24 13:20:30.577289] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:08:12.740 13:20:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:12.740 13:20:31 -- common/autotest_common.sh@852 -- # return 0 00:08:12.740 13:20:31 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:08:12.740 13:20:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:12.740 13:20:31 -- common/autotest_common.sh@10 -- # set +x 00:08:12.740 13:20:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 
00:08:12.740 13:20:31 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:12.740 13:20:31 -- common/autotest_common.sh@640 -- # local es=0 00:08:12.740 13:20:31 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:12.740 13:20:31 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:08:12.740 13:20:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:12.740 13:20:31 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:08:12.740 13:20:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:12.740 13:20:31 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:12.740 13:20:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:12.740 13:20:31 -- common/autotest_common.sh@10 -- # set +x 00:08:12.740 [2024-07-24 13:20:31.354285] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3156436 has claimed it. 
00:08:12.740 request: 00:08:12.740 { 00:08:12.740 "method": "framework_enable_cpumask_locks", 00:08:12.740 "req_id": 1 00:08:12.740 } 00:08:12.740 Got JSON-RPC error response 00:08:12.740 response: 00:08:12.740 { 00:08:12.740 "code": -32603, 00:08:12.740 "message": "Failed to claim CPU core: 2" 00:08:12.740 } 00:08:12.740 13:20:31 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:08:12.740 13:20:31 -- common/autotest_common.sh@643 -- # es=1 00:08:12.740 13:20:31 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:12.740 13:20:31 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:12.740 13:20:31 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:12.740 13:20:31 -- event/cpu_locks.sh@158 -- # waitforlisten 3156436 /var/tmp/spdk.sock 00:08:12.740 13:20:31 -- common/autotest_common.sh@819 -- # '[' -z 3156436 ']' 00:08:12.740 13:20:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:12.740 13:20:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:12.740 13:20:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:12.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:12.740 13:20:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:12.740 13:20:31 -- common/autotest_common.sh@10 -- # set +x 00:08:12.999 13:20:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:12.999 13:20:31 -- common/autotest_common.sh@852 -- # return 0 00:08:12.999 13:20:31 -- event/cpu_locks.sh@159 -- # waitforlisten 3156451 /var/tmp/spdk2.sock 00:08:12.999 13:20:31 -- common/autotest_common.sh@819 -- # '[' -z 3156451 ']' 00:08:12.999 13:20:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:12.999 13:20:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:12.999 13:20:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:12.999 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:12.999 13:20:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:12.999 13:20:31 -- common/autotest_common.sh@10 -- # set +x 00:08:12.999 13:20:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:12.999 13:20:31 -- common/autotest_common.sh@852 -- # return 0 00:08:12.999 13:20:31 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:08:12.999 13:20:31 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:08:12.999 13:20:31 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:08:12.999 13:20:31 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:08:12.999 00:08:12.999 real 0m2.370s 00:08:12.999 user 0m1.041s 00:08:12.999 sys 0m0.253s 00:08:12.999 13:20:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.999 13:20:31 -- common/autotest_common.sh@10 -- # set +x 00:08:12.999 
************************************ 00:08:12.999 END TEST locking_overlapped_coremask_via_rpc 00:08:12.999 ************************************ 00:08:12.999 13:20:31 -- event/cpu_locks.sh@174 -- # cleanup 00:08:12.999 13:20:31 -- event/cpu_locks.sh@15 -- # [[ -z 3156436 ]] 00:08:12.999 13:20:31 -- event/cpu_locks.sh@15 -- # killprocess 3156436 00:08:12.999 13:20:31 -- common/autotest_common.sh@926 -- # '[' -z 3156436 ']' 00:08:12.999 13:20:31 -- common/autotest_common.sh@930 -- # kill -0 3156436 00:08:12.999 13:20:31 -- common/autotest_common.sh@931 -- # uname 00:08:12.999 13:20:31 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:12.999 13:20:31 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3156436 00:08:13.258 13:20:31 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:13.258 13:20:31 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:13.258 13:20:31 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3156436' 00:08:13.258 killing process with pid 3156436 00:08:13.258 13:20:31 -- common/autotest_common.sh@945 -- # kill 3156436 00:08:13.258 13:20:31 -- common/autotest_common.sh@950 -- # wait 3156436 00:08:13.517 13:20:32 -- event/cpu_locks.sh@16 -- # [[ -z 3156451 ]] 00:08:13.517 13:20:32 -- event/cpu_locks.sh@16 -- # killprocess 3156451 00:08:13.517 13:20:32 -- common/autotest_common.sh@926 -- # '[' -z 3156451 ']' 00:08:13.517 13:20:32 -- common/autotest_common.sh@930 -- # kill -0 3156451 00:08:13.517 13:20:32 -- common/autotest_common.sh@931 -- # uname 00:08:13.517 13:20:32 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:13.517 13:20:32 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3156451 00:08:13.517 13:20:32 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:08:13.517 13:20:32 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:08:13.517 13:20:32 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 
3156451' 00:08:13.517 killing process with pid 3156451 00:08:13.517 13:20:32 -- common/autotest_common.sh@945 -- # kill 3156451 00:08:13.517 13:20:32 -- common/autotest_common.sh@950 -- # wait 3156451 00:08:13.776 13:20:32 -- event/cpu_locks.sh@18 -- # rm -f 00:08:13.776 13:20:32 -- event/cpu_locks.sh@1 -- # cleanup 00:08:13.776 13:20:32 -- event/cpu_locks.sh@15 -- # [[ -z 3156436 ]] 00:08:13.776 13:20:32 -- event/cpu_locks.sh@15 -- # killprocess 3156436 00:08:13.776 13:20:32 -- common/autotest_common.sh@926 -- # '[' -z 3156436 ']' 00:08:13.776 13:20:32 -- common/autotest_common.sh@930 -- # kill -0 3156436 00:08:13.776 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (3156436) - No such process 00:08:13.776 13:20:32 -- common/autotest_common.sh@953 -- # echo 'Process with pid 3156436 is not found' 00:08:13.776 Process with pid 3156436 is not found 00:08:13.776 13:20:32 -- event/cpu_locks.sh@16 -- # [[ -z 3156451 ]] 00:08:13.776 13:20:32 -- event/cpu_locks.sh@16 -- # killprocess 3156451 00:08:13.776 13:20:32 -- common/autotest_common.sh@926 -- # '[' -z 3156451 ']' 00:08:13.777 13:20:32 -- common/autotest_common.sh@930 -- # kill -0 3156451 00:08:13.777 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (3156451) - No such process 00:08:13.777 13:20:32 -- common/autotest_common.sh@953 -- # echo 'Process with pid 3156451 is not found' 00:08:13.777 Process with pid 3156451 is not found 00:08:13.777 13:20:32 -- event/cpu_locks.sh@18 -- # rm -f 00:08:13.777 00:08:13.777 real 0m20.372s 00:08:13.777 user 0m34.379s 00:08:13.777 sys 0m6.963s 00:08:13.777 13:20:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.777 13:20:32 -- common/autotest_common.sh@10 -- # set +x 00:08:13.777 ************************************ 00:08:13.777 END TEST cpu_locks 00:08:13.777 ************************************ 00:08:14.036 00:08:14.036 real 0m46.433s 00:08:14.036 user 1m26.008s 
00:08:14.036 sys 0m11.905s 00:08:14.036 13:20:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.036 13:20:32 -- common/autotest_common.sh@10 -- # set +x 00:08:14.036 ************************************ 00:08:14.036 END TEST event 00:08:14.036 ************************************ 00:08:14.036 13:20:32 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:08:14.036 13:20:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:14.036 13:20:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:14.036 13:20:32 -- common/autotest_common.sh@10 -- # set +x 00:08:14.036 ************************************ 00:08:14.036 START TEST thread 00:08:14.036 ************************************ 00:08:14.036 13:20:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:08:14.036 * Looking for test storage... 00:08:14.036 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:08:14.036 13:20:32 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:14.036 13:20:32 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:14.036 13:20:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:14.036 13:20:32 -- common/autotest_common.sh@10 -- # set +x 00:08:14.036 ************************************ 00:08:14.036 START TEST thread_poller_perf 00:08:14.036 ************************************ 00:08:14.036 13:20:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:14.036 [2024-07-24 13:20:32.840748] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:14.036 [2024-07-24 13:20:32.840852] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156906 ] 00:08:14.036 EAL: No free 2048 kB hugepages reported on node 1 00:08:14.295 [2024-07-24 13:20:32.961890] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.295 [2024-07-24 13:20:33.009470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.295 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:08:15.233 ====================================== 00:08:15.233 busy:2306222236 (cyc) 00:08:15.233 total_run_count: 504000 00:08:15.233 tsc_hz: 2300000000 (cyc) 00:08:15.233 ====================================== 00:08:15.233 poller_cost: 4575 (cyc), 1989 (nsec) 00:08:15.233 00:08:15.233 real 0m1.260s 00:08:15.233 user 0m1.122s 00:08:15.233 sys 0m0.130s 00:08:15.233 13:20:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.233 13:20:34 -- common/autotest_common.sh@10 -- # set +x 00:08:15.233 ************************************ 00:08:15.233 END TEST thread_poller_perf 00:08:15.233 ************************************ 00:08:15.492 13:20:34 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:15.492 13:20:34 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:15.492 13:20:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:15.492 13:20:34 -- common/autotest_common.sh@10 -- # set +x 00:08:15.492 ************************************ 00:08:15.492 START TEST thread_poller_perf 00:08:15.492 ************************************ 00:08:15.492 13:20:34 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:15.492 
[2024-07-24 13:20:34.152315] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:15.492 [2024-07-24 13:20:34.152413] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157102 ] 00:08:15.492 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.492 [2024-07-24 13:20:34.274935] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.492 [2024-07-24 13:20:34.325767] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.492 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:08:16.868 ====================================== 00:08:16.868 busy:2302528620 (cyc) 00:08:16.868 total_run_count: 8553000 00:08:16.868 tsc_hz: 2300000000 (cyc) 00:08:16.868 ====================================== 00:08:16.868 poller_cost: 269 (cyc), 116 (nsec) 00:08:16.868 00:08:16.868 real 0m1.262s 00:08:16.868 user 0m1.112s 00:08:16.868 sys 0m0.143s 00:08:16.868 13:20:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.868 13:20:35 -- common/autotest_common.sh@10 -- # set +x 00:08:16.868 ************************************ 00:08:16.868 END TEST thread_poller_perf 00:08:16.868 ************************************ 00:08:16.868 13:20:35 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:08:16.868 13:20:35 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:08:16.868 13:20:35 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:16.868 13:20:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:16.868 13:20:35 -- common/autotest_common.sh@10 -- # set +x 00:08:16.868 ************************************ 00:08:16.868 START TEST thread_spdk_lock 00:08:16.868 ************************************ 00:08:16.868 13:20:35 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:08:16.868 [2024-07-24 13:20:35.465740] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:16.868 [2024-07-24 13:20:35.465855] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157296 ] 00:08:16.868 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.868 [2024-07-24 13:20:35.588603] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:16.868 [2024-07-24 13:20:35.640026] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:16.868 [2024-07-24 13:20:35.640031] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.436 [2024-07-24 13:20:36.150460] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:17.436 [2024-07-24 13:20:36.150519] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:08:17.436 [2024-07-24 13:20:36.150537] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x133de80 00:08:17.436 [2024-07-24 13:20:36.151473] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:17.436 [2024-07-24 13:20:36.151577] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while 
SPDK thread going off CPU (thread->lock_count == 0) 00:08:17.436 [2024-07-24 13:20:36.151604] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:17.436 Starting test contend 00:08:17.436 Worker Delay Wait us Hold us Total us 00:08:17.436 0 3 155764 193736 349500 00:08:17.436 1 5 81203 292765 373969 00:08:17.436 PASS test contend 00:08:17.436 Starting test hold_by_poller 00:08:17.436 PASS test hold_by_poller 00:08:17.436 Starting test hold_by_message 00:08:17.436 PASS test hold_by_message 00:08:17.436 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:08:17.436 100014 assertions passed 00:08:17.436 0 assertions failed 00:08:17.437 00:08:17.437 real 0m0.772s 00:08:17.437 user 0m1.142s 00:08:17.437 sys 0m0.135s 00:08:17.437 13:20:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.437 13:20:36 -- common/autotest_common.sh@10 -- # set +x 00:08:17.437 ************************************ 00:08:17.437 END TEST thread_spdk_lock 00:08:17.437 ************************************ 00:08:17.437 00:08:17.437 real 0m3.551s 00:08:17.437 user 0m3.472s 00:08:17.437 sys 0m0.607s 00:08:17.437 13:20:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.437 13:20:36 -- common/autotest_common.sh@10 -- # set +x 00:08:17.437 ************************************ 00:08:17.437 END TEST thread 00:08:17.437 ************************************ 00:08:17.696 13:20:36 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:08:17.696 13:20:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:17.696 13:20:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:17.696 13:20:36 -- common/autotest_common.sh@10 -- # set +x 00:08:17.696 ************************************ 00:08:17.696 START TEST 
accel 00:08:17.696 ************************************ 00:08:17.696 13:20:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:08:17.696 * Looking for test storage... 00:08:17.696 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:08:17.696 13:20:36 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:08:17.696 13:20:36 -- accel/accel.sh@74 -- # get_expected_opcs 00:08:17.696 13:20:36 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:17.696 13:20:36 -- accel/accel.sh@59 -- # spdk_tgt_pid=3157528 00:08:17.696 13:20:36 -- accel/accel.sh@60 -- # waitforlisten 3157528 00:08:17.696 13:20:36 -- common/autotest_common.sh@819 -- # '[' -z 3157528 ']' 00:08:17.696 13:20:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:17.696 13:20:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:17.696 13:20:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:17.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:17.696 13:20:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:17.696 13:20:36 -- common/autotest_common.sh@10 -- # set +x 00:08:17.696 13:20:36 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:17.696 13:20:36 -- accel/accel.sh@58 -- # build_accel_config 00:08:17.696 13:20:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:17.696 13:20:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.696 13:20:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.696 13:20:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:17.696 13:20:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:17.696 13:20:36 -- accel/accel.sh@41 -- # local IFS=, 00:08:17.696 13:20:36 -- accel/accel.sh@42 -- # jq -r . 
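As an aside on the thread_poller_perf summary further up: the reported poller_cost appears to be just the busy cycle count divided by total_run_count, converted to nanoseconds via tsc_hz. A minimal sketch with the values copied from the log (the formula is an assumption inferred from the printed fields, not taken from the SPDK source):

```shell
# Recompute the poller_perf summary (all values copied from the log above).
busy=2302528620      # "busy": total busy cycles over the 1 s run
count=8553000        # "total_run_count"
tsc_hz=2300000000    # "tsc_hz": TSC frequency in Hz
cyc=$(( busy / count ))                  # per-poll cost in cycles
nsec=$(( cyc * 1000000000 / tsc_hz ))    # cycles -> nanoseconds
echo "poller_cost: ${cyc} (cyc), ${nsec} (nsec)"
```

With integer arithmetic this reproduces the logged `poller_cost: 269 (cyc), 116 (nsec)` line.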
00:08:17.696 [2024-07-24 13:20:36.417538] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:17.696 [2024-07-24 13:20:36.417632] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157528 ] 00:08:17.696 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.696 [2024-07-24 13:20:36.527192] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.955 [2024-07-24 13:20:36.571836] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.955 [2024-07-24 13:20:36.571985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.521 13:20:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:18.521 13:20:37 -- common/autotest_common.sh@852 -- # return 0 00:08:18.522 13:20:37 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:18.522 13:20:37 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:08:18.522 13:20:37 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:18.522 13:20:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:18.522 13:20:37 -- common/autotest_common.sh@10 -- # set +x 00:08:18.522 13:20:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for 
opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 
00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # IFS== 00:08:18.781 13:20:37 -- accel/accel.sh@64 -- # read -r opc module 00:08:18.781 13:20:37 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:18.781 13:20:37 -- accel/accel.sh@67 -- # killprocess 3157528 00:08:18.781 13:20:37 -- common/autotest_common.sh@926 -- # '[' -z 3157528 ']' 00:08:18.781 13:20:37 -- common/autotest_common.sh@930 -- # kill -0 3157528 00:08:18.781 13:20:37 -- common/autotest_common.sh@931 -- # uname 00:08:18.781 13:20:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:18.781 13:20:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3157528 00:08:18.781 13:20:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:18.781 13:20:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:18.781 13:20:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3157528' 00:08:18.781 killing process with pid 3157528 00:08:18.781 13:20:37 -- common/autotest_common.sh@945 -- # kill 3157528 00:08:18.781 13:20:37 -- common/autotest_common.sh@950 -- # wait 3157528 00:08:19.040 13:20:37 -- accel/accel.sh@68 -- # trap - ERR 00:08:19.040 13:20:37 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:08:19.040 13:20:37 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:08:19.040 13:20:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.040 13:20:37 -- common/autotest_common.sh@10 -- # set +x 00:08:19.040 13:20:37 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:08:19.040 13:20:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:19.040 13:20:37 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.040 13:20:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.040 13:20:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.040 13:20:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.040 13:20:37 -- 
accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.040 13:20:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.040 13:20:37 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.040 13:20:37 -- accel/accel.sh@42 -- # jq -r . 00:08:19.040 13:20:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.040 13:20:37 -- common/autotest_common.sh@10 -- # set +x 00:08:19.040 13:20:37 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:19.040 13:20:37 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:19.040 13:20:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.040 13:20:37 -- common/autotest_common.sh@10 -- # set +x 00:08:19.040 ************************************ 00:08:19.040 START TEST accel_missing_filename 00:08:19.040 ************************************ 00:08:19.040 13:20:37 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:08:19.040 13:20:37 -- common/autotest_common.sh@640 -- # local es=0 00:08:19.040 13:20:37 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:19.040 13:20:37 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:19.040 13:20:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.040 13:20:37 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:19.040 13:20:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.040 13:20:37 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:08:19.040 13:20:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:19.040 13:20:37 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.040 13:20:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.040 13:20:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.040 13:20:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.040 13:20:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 
00:08:19.040 13:20:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.040 13:20:37 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.040 13:20:37 -- accel/accel.sh@42 -- # jq -r . 00:08:19.040 [2024-07-24 13:20:37.881060] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:19.040 [2024-07-24 13:20:37.881181] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157749 ] 00:08:19.299 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.299 [2024-07-24 13:20:37.999720] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.299 [2024-07-24 13:20:38.047598] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.299 [2024-07-24 13:20:38.097128] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:19.558 [2024-07-24 13:20:38.169380] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:08:19.558 A filename is required. 
00:08:19.558 13:20:38 -- common/autotest_common.sh@643 -- # es=234 00:08:19.558 13:20:38 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:19.558 13:20:38 -- common/autotest_common.sh@652 -- # es=106 00:08:19.558 13:20:38 -- common/autotest_common.sh@653 -- # case "$es" in 00:08:19.558 13:20:38 -- common/autotest_common.sh@660 -- # es=1 00:08:19.558 13:20:38 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:19.558 00:08:19.558 real 0m0.380s 00:08:19.558 user 0m0.237s 00:08:19.558 sys 0m0.185s 00:08:19.558 13:20:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.558 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:19.558 ************************************ 00:08:19.558 END TEST accel_missing_filename 00:08:19.558 ************************************ 00:08:19.558 13:20:38 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.558 13:20:38 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:08:19.558 13:20:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.559 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:19.559 ************************************ 00:08:19.559 START TEST accel_compress_verify 00:08:19.559 ************************************ 00:08:19.559 13:20:38 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.559 13:20:38 -- common/autotest_common.sh@640 -- # local es=0 00:08:19.559 13:20:38 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.559 13:20:38 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:19.559 13:20:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.559 13:20:38 -- common/autotest_common.sh@632 -- # type 
-t accel_perf 00:08:19.559 13:20:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.559 13:20:38 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.559 13:20:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.559 13:20:38 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.559 13:20:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.559 13:20:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.559 13:20:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.559 13:20:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.559 13:20:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.559 13:20:38 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.559 13:20:38 -- accel/accel.sh@42 -- # jq -r . 00:08:19.559 [2024-07-24 13:20:38.301179] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:19.559 [2024-07-24 13:20:38.301278] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157769 ] 00:08:19.559 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.559 [2024-07-24 13:20:38.408834] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.817 [2024-07-24 13:20:38.456535] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.817 [2024-07-24 13:20:38.506028] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:19.817 [2024-07-24 13:20:38.578540] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:08:19.817 00:08:19.817 Compression does not support the verify option, aborting. 
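The `es=234` → `es=106` → `es=1` sequence traced by autotest_common.sh in the NOT-wrapper output looks like exit-status normalization: a status above 128 (command killed by a signal) has the 128 offset stripped, and any remaining nonzero code is collapsed to a generic failure. A hedged sketch of that inferred logic — `normalize_es` is a hypothetical name, and the case mapping is an assumption, not the harness's actual code:

```shell
# Hypothetical reconstruction of the exit-status normalization inferred
# from the xtrace sequences "es=234 ... es=106 ... es=1" and
# "es=161 ... es=33 ... es=1" in the log above.
normalize_es() {
    local es=$1
    (( es > 128 )) && es=$(( es - 128 ))   # strip the killed-by-signal offset
    case "$es" in
        0) ;;          # success stays success
        *) es=1 ;;     # any other code collapses to a generic failure
    esac
    echo "$es"
}
normalize_es 234
```

Both traced sequences (234 and 161) end at `es=1` under this rule, consistent with the log.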
00:08:19.817 13:20:38 -- common/autotest_common.sh@643 -- # es=161 00:08:19.817 13:20:38 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:19.817 13:20:38 -- common/autotest_common.sh@652 -- # es=33 00:08:19.817 13:20:38 -- common/autotest_common.sh@653 -- # case "$es" in 00:08:19.817 13:20:38 -- common/autotest_common.sh@660 -- # es=1 00:08:19.817 13:20:38 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:19.817 00:08:19.817 real 0m0.370s 00:08:19.817 user 0m0.236s 00:08:19.817 sys 0m0.173s 00:08:19.817 13:20:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.817 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:19.817 ************************************ 00:08:19.817 END TEST accel_compress_verify 00:08:19.817 ************************************ 00:08:20.076 13:20:38 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:08:20.076 13:20:38 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:20.076 13:20:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:20.076 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:20.076 ************************************ 00:08:20.076 START TEST accel_wrong_workload 00:08:20.076 ************************************ 00:08:20.076 13:20:38 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:08:20.076 13:20:38 -- common/autotest_common.sh@640 -- # local es=0 00:08:20.076 13:20:38 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:20.076 13:20:38 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:20.076 13:20:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:20.076 13:20:38 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:20.076 13:20:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:20.076 13:20:38 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:08:20.076 13:20:38 -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:20.076 13:20:38 -- accel/accel.sh@12 -- # build_accel_config 00:08:20.076 13:20:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:20.076 13:20:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.076 13:20:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.076 13:20:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:20.076 13:20:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:20.076 13:20:38 -- accel/accel.sh@41 -- # local IFS=, 00:08:20.076 13:20:38 -- accel/accel.sh@42 -- # jq -r . 00:08:20.076 Unsupported workload type: foobar 00:08:20.076 [2024-07-24 13:20:38.712305] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:20.076 accel_perf options: 00:08:20.076 [-h help message] 00:08:20.076 [-q queue depth per core] 00:08:20.076 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:20.076 [-T number of threads per core 00:08:20.076 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:08:20.076 [-t time in seconds] 00:08:20.076 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:20.076 [ dif_verify, , dif_generate, dif_generate_copy 00:08:20.076 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:20.076 [-l for compress/decompress workloads, name of uncompressed input file 00:08:20.076 [-S for crc32c workload, use this seed value (default 0) 00:08:20.076 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:20.076 [-f for fill workload, use this BYTE value (default 255) 00:08:20.076 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:20.076 [-y verify result if this switch is on] 00:08:20.076 [-a tasks to allocate per core (default: same value as -q)] 00:08:20.076 Can be used to spread operations across a wider range of memory. 00:08:20.076 13:20:38 -- common/autotest_common.sh@643 -- # es=1 00:08:20.076 13:20:38 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:20.076 13:20:38 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:20.076 13:20:38 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:20.076 00:08:20.076 real 0m0.026s 00:08:20.076 user 0m0.011s 00:08:20.076 sys 0m0.015s 00:08:20.076 13:20:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.076 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:20.076 ************************************ 00:08:20.077 END TEST accel_wrong_workload 00:08:20.077 ************************************ 00:08:20.077 Error: writing output failed: Broken pipe 00:08:20.077 13:20:38 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:20.077 13:20:38 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:08:20.077 13:20:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 
00:08:20.077 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:20.077 ************************************ 00:08:20.077 START TEST accel_negative_buffers 00:08:20.077 ************************************ 00:08:20.077 13:20:38 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:20.077 13:20:38 -- common/autotest_common.sh@640 -- # local es=0 00:08:20.077 13:20:38 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:20.077 13:20:38 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:20.077 13:20:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:20.077 13:20:38 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:20.077 13:20:38 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:20.077 13:20:38 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:08:20.077 13:20:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:08:20.077 13:20:38 -- accel/accel.sh@12 -- # build_accel_config 00:08:20.077 13:20:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:20.077 13:20:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.077 13:20:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.077 13:20:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:20.077 13:20:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:20.077 13:20:38 -- accel/accel.sh@41 -- # local IFS=, 00:08:20.077 13:20:38 -- accel/accel.sh@42 -- # jq -r . 00:08:20.077 -x option must be non-negative. 
00:08:20.077 [2024-07-24 13:20:38.781030] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:20.077 accel_perf options: 00:08:20.077 [-h help message] 00:08:20.077 [-q queue depth per core] 00:08:20.077 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:20.077 [-T number of threads per core 00:08:20.077 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:20.077 [-t time in seconds] 00:08:20.077 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:20.077 [ dif_verify, , dif_generate, dif_generate_copy 00:08:20.077 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:20.077 [-l for compress/decompress workloads, name of uncompressed input file 00:08:20.077 [-S for crc32c workload, use this seed value (default 0) 00:08:20.077 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:20.077 [-f for fill workload, use this BYTE value (default 255) 00:08:20.077 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:20.077 [-y verify result if this switch is on] 00:08:20.077 [-a tasks to allocate per core (default: same value as -q)] 00:08:20.077 Can be used to spread operations across a wider range of memory. 
00:08:20.077 13:20:38 -- common/autotest_common.sh@643 -- # es=1 00:08:20.077 13:20:38 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:20.077 13:20:38 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:20.077 Error: writing output failed: Broken pipe 00:08:20.077 13:20:38 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:20.077 00:08:20.077 real 0m0.026s 00:08:20.077 user 0m0.011s 00:08:20.077 sys 0m0.015s 00:08:20.077 13:20:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.077 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:20.077 ************************************ 00:08:20.077 END TEST accel_negative_buffers 00:08:20.077 ************************************ 00:08:20.077 13:20:38 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:08:20.077 13:20:38 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:20.077 13:20:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:20.077 13:20:38 -- common/autotest_common.sh@10 -- # set +x 00:08:20.077 ************************************ 00:08:20.077 START TEST accel_crc32c 00:08:20.077 ************************************ 00:08:20.077 13:20:38 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:08:20.077 13:20:38 -- accel/accel.sh@16 -- # local accel_opc 00:08:20.077 13:20:38 -- accel/accel.sh@17 -- # local accel_module 00:08:20.077 13:20:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:20.077 13:20:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:20.077 13:20:38 -- accel/accel.sh@12 -- # build_accel_config 00:08:20.077 13:20:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:20.077 13:20:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.077 13:20:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.077 13:20:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:20.077 13:20:38 
-- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:20.077 13:20:38 -- accel/accel.sh@41 -- # local IFS=, 00:08:20.077 13:20:38 -- accel/accel.sh@42 -- # jq -r . 00:08:20.077 [2024-07-24 13:20:38.852981] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:20.077 [2024-07-24 13:20:38.853061] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157959 ] 00:08:20.077 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.336 [2024-07-24 13:20:38.973980] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.336 [2024-07-24 13:20:39.018583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.713 13:20:40 -- accel/accel.sh@18 -- # out=' 00:08:21.713 SPDK Configuration: 00:08:21.713 Core mask: 0x1 00:08:21.713 00:08:21.713 Accel Perf Configuration: 00:08:21.713 Workload Type: crc32c 00:08:21.713 CRC-32C seed: 32 00:08:21.713 Transfer size: 4096 bytes 00:08:21.713 Vector count 1 00:08:21.713 Module: software 00:08:21.713 Queue depth: 32 00:08:21.713 Allocate depth: 32 00:08:21.713 # threads/core: 1 00:08:21.713 Run time: 1 seconds 00:08:21.713 Verify: Yes 00:08:21.713 00:08:21.713 Running for 1 seconds... 
00:08:21.713 00:08:21.713 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:21.713 ------------------------------------------------------------------------------------ 00:08:21.713 0,0 526560/s 2056 MiB/s 0 0 00:08:21.713 ==================================================================================== 00:08:21.713 Total 526560/s 2056 MiB/s 0 0' 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:21.713 13:20:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:21.713 13:20:40 -- accel/accel.sh@12 -- # build_accel_config 00:08:21.713 13:20:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:21.713 13:20:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.713 13:20:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.713 13:20:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:21.713 13:20:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:21.713 13:20:40 -- accel/accel.sh@41 -- # local IFS=, 00:08:21.713 13:20:40 -- accel/accel.sh@42 -- # jq -r . 00:08:21.713 [2024-07-24 13:20:40.228583] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:21.713 [2024-07-24 13:20:40.228682] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158153 ] 00:08:21.713 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.713 [2024-07-24 13:20:40.348573] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.713 [2024-07-24 13:20:40.391815] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=0x1 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=crc32c 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- 
accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=32 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=software 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@23 -- # accel_module=software 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=32 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=32 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=1 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 
-- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val=Yes 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:21.713 13:20:40 -- accel/accel.sh@21 -- # val= 00:08:21.713 13:20:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # IFS=: 00:08:21.713 13:20:40 -- accel/accel.sh@20 -- # read -r var val 00:08:23.090 13:20:41 -- accel/accel.sh@21 -- # val= 00:08:23.090 13:20:41 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # IFS=: 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # read -r var val 00:08:23.090 13:20:41 -- accel/accel.sh@21 -- # val= 00:08:23.090 13:20:41 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # IFS=: 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # read -r var val 00:08:23.090 13:20:41 -- accel/accel.sh@21 -- # val= 00:08:23.090 13:20:41 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # IFS=: 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # read -r var val 00:08:23.090 13:20:41 -- accel/accel.sh@21 -- # val= 00:08:23.090 13:20:41 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # IFS=: 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # read -r var val 00:08:23.090 13:20:41 -- accel/accel.sh@21 -- # val= 00:08:23.090 13:20:41 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # IFS=: 00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # read -r var val 00:08:23.090 13:20:41 -- accel/accel.sh@21 -- # val= 00:08:23.090 13:20:41 -- accel/accel.sh@22 -- # 
case "$var" in
00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # IFS=:
00:08:23.090 13:20:41 -- accel/accel.sh@20 -- # read -r var val
00:08:23.090 13:20:41 -- accel/accel.sh@28 -- # [[ -n software ]]
00:08:23.090 13:20:41 -- accel/accel.sh@28 -- # [[ -n crc32c ]]
00:08:23.090 13:20:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:23.090
00:08:23.090 real 0m2.748s
00:08:23.090 user 0m2.407s
00:08:23.090 sys 0m0.339s
00:08:23.090 13:20:41 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:23.090 13:20:41 -- common/autotest_common.sh@10 -- # set +x
00:08:23.090 ************************************
00:08:23.090 END TEST accel_crc32c
00:08:23.090 ************************************
00:08:23.090 13:20:41 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2
00:08:23.090 13:20:41 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']'
00:08:23.090 13:20:41 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:08:23.090 13:20:41 -- common/autotest_common.sh@10 -- # set +x
00:08:23.090 ************************************
00:08:23.090 START TEST accel_crc32c_C2
00:08:23.090 ************************************
00:08:23.090 13:20:41 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2
00:08:23.090 13:20:41 -- accel/accel.sh@16 -- # local accel_opc
00:08:23.090 13:20:41 -- accel/accel.sh@17 -- # local accel_module
00:08:23.090 13:20:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2
00:08:23.090 13:20:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2
00:08:23.090 13:20:41 -- accel/accel.sh@12 -- # build_accel_config
00:08:23.090 13:20:41 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:23.090 13:20:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:23.090 13:20:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:23.090 13:20:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:23.090 13:20:41 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:23.090 13:20:41 -- accel/accel.sh@41 -- # local IFS=,
00:08:23.090 13:20:41 -- accel/accel.sh@42 -- # jq -r .
00:08:23.090 [2024-07-24 13:20:41.642574] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:23.090 [2024-07-24 13:20:41.642666] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158369 ]
00:08:23.090 EAL: No free 2048 kB hugepages reported on node 1
00:08:23.090 [2024-07-24 13:20:41.761185] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:23.090 [2024-07-24 13:20:41.806006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:24.505 13:20:42 -- accel/accel.sh@18 -- # out='
00:08:24.505 SPDK Configuration:
00:08:24.505 Core mask: 0x1
00:08:24.505
00:08:24.505 Accel Perf Configuration:
00:08:24.505 Workload Type: crc32c
00:08:24.505 CRC-32C seed: 0
00:08:24.505 Transfer size: 4096 bytes
00:08:24.505 Vector count 2
00:08:24.505 Module: software
00:08:24.505 Queue depth: 32
00:08:24.505 Allocate depth: 32
00:08:24.505 # threads/core: 1
00:08:24.505 Run time: 1 seconds
00:08:24.505 Verify: Yes
00:08:24.505
00:08:24.505 Running for 1 seconds...
00:08:24.505
00:08:24.505 Core,Thread  Transfers  Bandwidth  Failed  Miscompares
00:08:24.505 ------------------------------------------------------------------------------------
00:08:24.505 0,0  379424/s  2964 MiB/s  0  0
00:08:24.505 ====================================================================================
00:08:24.505 Total  379424/s  1482 MiB/s  0  0'
00:08:24.505 13:20:42 -- accel/accel.sh@20 -- # IFS=:
00:08:24.505 13:20:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2
00:08:24.505 13:20:42 -- accel/accel.sh@20 -- # read -r var val
00:08:24.505 13:20:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2
00:08:24.505 13:20:42 -- accel/accel.sh@12 -- # build_accel_config
00:08:24.505 13:20:42 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:24.505 13:20:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:24.505 13:20:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:24.505 13:20:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:24.505 13:20:42 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:24.505 13:20:42 -- accel/accel.sh@41 -- # local IFS=,
00:08:24.505 13:20:42 -- accel/accel.sh@42 -- # jq -r .
00:08:24.505 [2024-07-24 13:20:42.994970] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
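[Editor's note: the crc32c workloads above checksum 4096-byte buffers with CRC-32C (Castagnoli polynomial). As a reference point only — this is a minimal bit-by-bit sketch, not SPDK's table-driven/ISA-accelerated implementation — the checksum being computed is:

```python
def crc32c(data: bytes, crc: int = 0) -> int:
    """Reflected CRC-32C (Castagnoli): init/xorout 0xFFFFFFFF, poly 0x82F63B78."""
    crc ^= 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # Shift right; on carry-out, fold in the reflected polynomial.
            crc = (crc >> 1) ^ (0x82F63B78 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

# Standard check value for CRC-32C ("123456789" -> 0xE3069283)
assert crc32c(b"123456789") == 0xE3069283
```

The `-S 32` run above seeds the CRC with 32 rather than the default 0, and `-C 2` chains two source buffers per operation.]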
00:08:24.505 [2024-07-24 13:20:42.995028] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158555 ] 00:08:24.505 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.505 [2024-07-24 13:20:43.090924] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.505 [2024-07-24 13:20:43.136935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.505 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.505 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.505 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.505 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.505 13:20:43 -- accel/accel.sh@21 -- # val=0x1 00:08:24.505 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.505 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.505 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.505 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.505 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.505 13:20:43 -- accel/accel.sh@21 -- # val=crc32c 00:08:24.505 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.505 13:20:43 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:08:24.505 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.505 13:20:43 -- 
accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val=0 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val=software 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@23 -- # accel_module=software 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val=32 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val=32 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val=1 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- 
# read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val=Yes 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:24.506 13:20:43 -- accel/accel.sh@21 -- # val= 00:08:24.506 13:20:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # IFS=: 00:08:24.506 13:20:43 -- accel/accel.sh@20 -- # read -r var val 00:08:25.882 13:20:44 -- accel/accel.sh@21 -- # val= 00:08:25.882 13:20:44 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.882 13:20:44 -- accel/accel.sh@20 -- # IFS=: 00:08:25.882 13:20:44 -- accel/accel.sh@20 -- # read -r var val 00:08:25.883 13:20:44 -- accel/accel.sh@21 -- # val= 00:08:25.883 13:20:44 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # IFS=: 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # read -r var val 00:08:25.883 13:20:44 -- accel/accel.sh@21 -- # val= 00:08:25.883 13:20:44 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # IFS=: 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # read -r var val 00:08:25.883 13:20:44 -- accel/accel.sh@21 -- # val= 00:08:25.883 13:20:44 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # IFS=: 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # read -r var val 00:08:25.883 13:20:44 -- accel/accel.sh@21 -- # val= 00:08:25.883 13:20:44 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # IFS=: 00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # read -r var val 00:08:25.883 13:20:44 -- accel/accel.sh@21 -- # val= 00:08:25.883 13:20:44 -- accel/accel.sh@22 -- # case 
"$var" in
00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # IFS=:
00:08:25.883 13:20:44 -- accel/accel.sh@20 -- # read -r var val
00:08:25.883 13:20:44 -- accel/accel.sh@28 -- # [[ -n software ]]
00:08:25.883 13:20:44 -- accel/accel.sh@28 -- # [[ -n crc32c ]]
00:08:25.883 13:20:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:25.883
00:08:25.883 real 0m2.698s
00:08:25.883 user 0m2.357s
00:08:25.883 sys 0m0.339s
00:08:25.883 13:20:44 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:25.883 13:20:44 -- common/autotest_common.sh@10 -- # set +x
00:08:25.883 ************************************
00:08:25.883 END TEST accel_crc32c_C2
00:08:25.883 ************************************
00:08:25.883 13:20:44 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y
00:08:25.883 13:20:44 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']'
00:08:25.883 13:20:44 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:08:25.883 13:20:44 -- common/autotest_common.sh@10 -- # set +x
00:08:25.883 ************************************
00:08:25.883 START TEST accel_copy
00:08:25.883 ************************************
00:08:25.883 13:20:44 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y
00:08:25.883 13:20:44 -- accel/accel.sh@16 -- # local accel_opc
00:08:25.883 13:20:44 -- accel/accel.sh@17 -- # local accel_module
00:08:25.883 13:20:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y
00:08:25.883 13:20:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y
00:08:25.883 13:20:44 -- accel/accel.sh@12 -- # build_accel_config
00:08:25.883 13:20:44 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:25.883 13:20:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:25.883 13:20:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:25.883 13:20:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:25.883 13:20:44 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:25.883 13:20:44 -- accel/accel.sh@41 -- # local IFS=,
00:08:25.883 13:20:44 -- accel/accel.sh@42 -- # jq -r .
00:08:25.883 [2024-07-24 13:20:44.371687] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:25.883 [2024-07-24 13:20:44.371747] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158752 ]
00:08:25.883 EAL: No free 2048 kB hugepages reported on node 1
00:08:25.883 [2024-07-24 13:20:44.473682] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:25.883 [2024-07-24 13:20:44.521507] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:27.261 13:20:45 -- accel/accel.sh@18 -- # out='
00:08:27.261 SPDK Configuration:
00:08:27.261 Core mask: 0x1
00:08:27.261
00:08:27.261 Accel Perf Configuration:
00:08:27.261 Workload Type: copy
00:08:27.261 Transfer size: 4096 bytes
00:08:27.261 Vector count 1
00:08:27.261 Module: software
00:08:27.261 Queue depth: 32
00:08:27.261 Allocate depth: 32
00:08:27.261 # threads/core: 1
00:08:27.261 Run time: 1 seconds
00:08:27.261 Verify: Yes
00:08:27.261
00:08:27.261 Running for 1 seconds...
00:08:27.261
00:08:27.261 Core,Thread  Transfers  Bandwidth  Failed  Miscompares
00:08:27.261 ------------------------------------------------------------------------------------
00:08:27.261 0,0  337408/s  1318 MiB/s  0  0
00:08:27.261 ====================================================================================
00:08:27.261 Total  337408/s  1318 MiB/s  0  0'
00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=:
00:08:27.261 13:20:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y
00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val
00:08:27.261 13:20:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y
00:08:27.261 13:20:45 -- accel/accel.sh@12 -- # build_accel_config
00:08:27.261 13:20:45 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:27.261 13:20:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:27.261 13:20:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:27.261 13:20:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:27.261 13:20:45 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:27.261 13:20:45 -- accel/accel.sh@41 -- # local IFS=,
00:08:27.261 13:20:45 -- accel/accel.sh@42 -- # jq -r .
00:08:27.261 [2024-07-24 13:20:45.716359] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
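[Editor's note: the Bandwidth column in these tables follows directly from the Transfers column and the 4096-byte transfer size, truncated to whole MiB/s. A quick arithmetic check against the accel_copy figures above (this helper is illustrative, not part of accel_perf):

```python
def mib_per_s(transfers_per_s: int, xfer_bytes: int = 4096) -> int:
    # transfers/s * bytes per transfer, converted to MiB/s (floor)
    return transfers_per_s * xfer_bytes // 2**20

# accel_copy reported 337408/s -> 1318 MiB/s; accel_fill 583808/s -> 2280 MiB/s
assert mib_per_s(337408) == 1318
assert mib_per_s(583808) == 2280
```

The same formula reproduces the 2056 MiB/s figure from the 526560/s crc32c run.]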
00:08:27.261 [2024-07-24 13:20:45.716418] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158932 ] 00:08:27.261 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.261 [2024-07-24 13:20:45.816310] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.261 [2024-07-24 13:20:45.863265] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val=0x1 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val=copy 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@24 -- # accel_opc=copy 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- 
accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.261 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.261 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.261 13:20:45 -- accel/accel.sh@21 -- # val=software 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@23 -- # accel_module=software 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val=32 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val=32 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val=1 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val=Yes 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 
-- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:27.262 13:20:45 -- accel/accel.sh@21 -- # val= 00:08:27.262 13:20:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # IFS=: 00:08:27.262 13:20:45 -- accel/accel.sh@20 -- # read -r var val 00:08:28.197 13:20:47 -- accel/accel.sh@21 -- # val= 00:08:28.197 13:20:47 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.197 13:20:47 -- accel/accel.sh@20 -- # IFS=: 00:08:28.197 13:20:47 -- accel/accel.sh@20 -- # read -r var val 00:08:28.197 13:20:47 -- accel/accel.sh@21 -- # val= 00:08:28.197 13:20:47 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.197 13:20:47 -- accel/accel.sh@20 -- # IFS=: 00:08:28.197 13:20:47 -- accel/accel.sh@20 -- # read -r var val 00:08:28.197 13:20:47 -- accel/accel.sh@21 -- # val= 00:08:28.197 13:20:47 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.197 13:20:47 -- accel/accel.sh@20 -- # IFS=: 00:08:28.197 13:20:47 -- accel/accel.sh@20 -- # read -r var val 00:08:28.198 13:20:47 -- accel/accel.sh@21 -- # val= 00:08:28.198 13:20:47 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.198 13:20:47 -- accel/accel.sh@20 -- # IFS=: 00:08:28.198 13:20:47 -- accel/accel.sh@20 -- # read -r var val 00:08:28.198 13:20:47 -- accel/accel.sh@21 -- # val= 00:08:28.198 13:20:47 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.198 13:20:47 -- accel/accel.sh@20 -- # IFS=: 00:08:28.198 13:20:47 -- accel/accel.sh@20 -- # read -r var val 00:08:28.198 13:20:47 -- accel/accel.sh@21 -- # val= 00:08:28.198 13:20:47 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.198 13:20:47 -- accel/accel.sh@20 -- # IFS=: 00:08:28.198 13:20:47 -- accel/accel.sh@20 -- # read -r var val 00:08:28.198 13:20:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:28.198 13:20:47 -- 
accel/accel.sh@28 -- # [[ -n copy ]]
00:08:28.198 13:20:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:28.198
00:08:28.198 real 0m2.695s
00:08:28.198 user 0m2.362s
00:08:28.198 sys 0m0.330s
00:08:28.198 13:20:47 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:28.198 13:20:47 -- common/autotest_common.sh@10 -- # set +x
00:08:28.198 ************************************
00:08:28.198 END TEST accel_copy
00:08:28.198 ************************************
00:08:28.520 13:20:47 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:28.520 13:20:47 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']'
00:08:28.520 13:20:47 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:08:28.520 13:20:47 -- common/autotest_common.sh@10 -- # set +x
00:08:28.520 ************************************
00:08:28.520 START TEST accel_fill
00:08:28.520 ************************************
00:08:28.520 13:20:47 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:28.520 13:20:47 -- accel/accel.sh@16 -- # local accel_opc
00:08:28.520 13:20:47 -- accel/accel.sh@17 -- # local accel_module
00:08:28.520 13:20:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:28.520 13:20:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:28.520 13:20:47 -- accel/accel.sh@12 -- # build_accel_config
00:08:28.520 13:20:47 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:28.520 13:20:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:28.520 13:20:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:28.520 13:20:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:28.520 13:20:47 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:28.520 13:20:47 -- accel/accel.sh@41 -- # local IFS=,
00:08:28.520 13:20:47 -- accel/accel.sh@42 -- # jq -r .
00:08:28.520 [2024-07-24 13:20:47.122407] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:28.520 [2024-07-24 13:20:47.122517] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3159131 ]
00:08:28.520 EAL: No free 2048 kB hugepages reported on node 1
00:08:28.520 [2024-07-24 13:20:47.239385] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:28.520 [2024-07-24 13:20:47.286920] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:29.904 13:20:48 -- accel/accel.sh@18 -- # out='
00:08:29.904 SPDK Configuration:
00:08:29.904 Core mask: 0x1
00:08:29.904
00:08:29.904 Accel Perf Configuration:
00:08:29.904 Workload Type: fill
00:08:29.904 Fill pattern: 0x80
00:08:29.904 Transfer size: 4096 bytes
00:08:29.904 Vector count 1
00:08:29.904 Module: software
00:08:29.904 Queue depth: 64
00:08:29.904 Allocate depth: 64
00:08:29.904 # threads/core: 1
00:08:29.904 Run time: 1 seconds
00:08:29.904 Verify: Yes
00:08:29.904
00:08:29.904 Running for 1 seconds...
00:08:29.904
00:08:29.904 Core,Thread  Transfers  Bandwidth  Failed  Miscompares
00:08:29.904 ------------------------------------------------------------------------------------
00:08:29.904 0,0  583808/s  2280 MiB/s  0  0
00:08:29.904 ====================================================================================
00:08:29.904 Total  583808/s  2280 MiB/s  0  0'
00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=:
00:08:29.904 13:20:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val
00:08:29.904 13:20:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:29.904 13:20:48 -- accel/accel.sh@12 -- # build_accel_config
00:08:29.904 13:20:48 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:29.904 13:20:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:29.904 13:20:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:29.904 13:20:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:29.904 13:20:48 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:29.904 13:20:48 -- accel/accel.sh@41 -- # local IFS=,
00:08:29.904 13:20:48 -- accel/accel.sh@42 -- # jq -r .
00:08:29.904 [2024-07-24 13:20:48.505124] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
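[Editor's note: per the configuration above, the fill workload writes a single-byte pattern (0x80, from `-f 128`) across each 4096-byte buffer and, because `Verify: Yes` (`-y`), reads it back counting miscompares. A toy per-transfer sketch of that check — not SPDK code:

```python
# Hypothetical software analogue of one fill transfer with verification.
PATTERN, XFER_SIZE = 0x80, 4096          # -f 128, default 4 KiB transfer

buf = bytearray(XFER_SIZE)               # destination buffer
buf[:] = bytes([PATTERN]) * XFER_SIZE    # the "fill" operation
miscompares = sum(1 for b in buf if b != PATTERN)  # the "-y" verify pass
assert miscompares == 0                  # matches the Miscompares column
```

In the real run, 64 such transfers are kept in flight at once (`-q 64`, queue depth 64).]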
00:08:29.904 [2024-07-24 13:20:48.505225] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3159311 ] 00:08:29.904 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.904 [2024-07-24 13:20:48.625934] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.904 [2024-07-24 13:20:48.672856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=0x1 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=fill 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@24 -- # accel_opc=fill 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- 
accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=0x80 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=software 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@23 -- # accel_module=software 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=64 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=64 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.904 13:20:48 -- accel/accel.sh@21 -- # val=1 00:08:29.904 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.904 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.905 13:20:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:29.905 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.905 13:20:48 -- accel/accel.sh@20 
-- # read -r var val 00:08:29.905 13:20:48 -- accel/accel.sh@21 -- # val=Yes 00:08:29.905 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.905 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.905 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:29.905 13:20:48 -- accel/accel.sh@21 -- # val= 00:08:29.905 13:20:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # IFS=: 00:08:29.905 13:20:48 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@21 -- # val= 00:08:31.280 13:20:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # IFS=: 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@21 -- # val= 00:08:31.280 13:20:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # IFS=: 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@21 -- # val= 00:08:31.280 13:20:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # IFS=: 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@21 -- # val= 00:08:31.280 13:20:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # IFS=: 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@21 -- # val= 00:08:31.280 13:20:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # IFS=: 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@21 -- # val= 00:08:31.280 13:20:49 -- accel/accel.sh@22 -- # 
case "$var" in 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # IFS=: 00:08:31.280 13:20:49 -- accel/accel.sh@20 -- # read -r var val 00:08:31.280 13:20:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:31.280 13:20:49 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:08:31.280 13:20:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:31.280 00:08:31.280 real 0m2.771s 00:08:31.280 user 0m2.408s 00:08:31.280 sys 0m0.360s 00:08:31.280 13:20:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.281 13:20:49 -- common/autotest_common.sh@10 -- # set +x 00:08:31.281 ************************************ 00:08:31.281 END TEST accel_fill 00:08:31.281 ************************************ 00:08:31.281 13:20:49 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:31.281 13:20:49 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:31.281 13:20:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:31.281 13:20:49 -- common/autotest_common.sh@10 -- # set +x 00:08:31.281 ************************************ 00:08:31.281 START TEST accel_copy_crc32c 00:08:31.281 ************************************ 00:08:31.281 13:20:49 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:08:31.281 13:20:49 -- accel/accel.sh@16 -- # local accel_opc 00:08:31.281 13:20:49 -- accel/accel.sh@17 -- # local accel_module 00:08:31.281 13:20:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:31.281 13:20:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:31.281 13:20:49 -- accel/accel.sh@12 -- # build_accel_config 00:08:31.281 13:20:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:31.281 13:20:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:31.281 13:20:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:31.281 13:20:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:31.281 13:20:49 -- 
accel/accel.sh@37 -- # [[ -n '' ]] 00:08:31.281 13:20:49 -- accel/accel.sh@41 -- # local IFS=, 00:08:31.281 13:20:49 -- accel/accel.sh@42 -- # jq -r . 00:08:31.281 [2024-07-24 13:20:49.941628] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:31.281 [2024-07-24 13:20:49.941726] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3159515 ] 00:08:31.281 EAL: No free 2048 kB hugepages reported on node 1 00:08:31.281 [2024-07-24 13:20:50.066958] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.281 [2024-07-24 13:20:50.118564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.657 13:20:51 -- accel/accel.sh@18 -- # out=' 00:08:32.657 SPDK Configuration: 00:08:32.657 Core mask: 0x1 00:08:32.657 00:08:32.657 Accel Perf Configuration: 00:08:32.657 Workload Type: copy_crc32c 00:08:32.657 CRC-32C seed: 0 00:08:32.657 Vector size: 4096 bytes 00:08:32.657 Transfer size: 4096 bytes 00:08:32.657 Vector count 1 00:08:32.657 Module: software 00:08:32.657 Queue depth: 32 00:08:32.657 Allocate depth: 32 00:08:32.657 # threads/core: 1 00:08:32.657 Run time: 1 seconds 00:08:32.657 Verify: Yes 00:08:32.657 00:08:32.657 Running for 1 seconds... 
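The copy_crc32c workload configured above copies a buffer while computing a CRC-32C checksum over it (seed 0). As a point of reference, here is a minimal bash sketch of the CRC-32C (Castagnoli) algorithm itself — illustration only, not SPDK code; the function name is ours:

```shell
#!/bin/bash
# Bitwise CRC-32C (Castagnoli, reflected polynomial 0x82F63B78) -- the
# checksum the copy_crc32c workload computes while copying. Sketch only.
crc32c() {
    local -i crc=0xFFFFFFFF byte i b
    local s=$1
    for ((i = 0; i < ${#s}; i++)); do
        printf -v byte '%d' "'${s:i:1}"   # ASCII code of the next byte
        ((crc ^= byte))
        for ((b = 0; b < 8; b++)); do
            if ((crc & 1)); then
                ((crc = (crc >> 1) ^ 0x82F63B78))
            else
                ((crc >>= 1))
            fi
        done
    done
    printf '0x%08X\n' $((crc ^ 0xFFFFFFFF))
}

crc32c 123456789   # standard CRC-32C check value: 0xE3069283
```

The check value for the ASCII string "123456789" is the conventional way to confirm a CRC implementation uses the intended polynomial.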
00:08:32.657 00:08:32.657 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:32.658 ------------------------------------------------------------------------------------ 00:08:32.658 0,0 258624/s 1010 MiB/s 0 0 00:08:32.658 ==================================================================================== 00:08:32.658 Total 258624/s 1010 MiB/s 0 0' 00:08:32.658 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.658 13:20:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:32.658 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.658 13:20:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:32.658 13:20:51 -- accel/accel.sh@12 -- # build_accel_config 00:08:32.658 13:20:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:32.658 13:20:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:32.658 13:20:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:32.658 13:20:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:32.658 13:20:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:32.658 13:20:51 -- accel/accel.sh@41 -- # local IFS=, 00:08:32.658 13:20:51 -- accel/accel.sh@42 -- # jq -r . 00:08:32.658 [2024-07-24 13:20:51.326867] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:32.658 [2024-07-24 13:20:51.326924] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3159695 ] 00:08:32.658 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.658 [2024-07-24 13:20:51.426522] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.658 [2024-07-24 13:20:51.473816] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=0x1 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=copy_crc32c 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- 
accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=0 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=software 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@23 -- # accel_module=software 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=32 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=32 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=1 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 
-- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val=Yes 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:32.917 13:20:51 -- accel/accel.sh@21 -- # val= 00:08:32.917 13:20:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # IFS=: 00:08:32.917 13:20:51 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@21 -- # val= 00:08:33.854 13:20:52 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # IFS=: 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@21 -- # val= 00:08:33.854 13:20:52 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # IFS=: 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@21 -- # val= 00:08:33.854 13:20:52 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # IFS=: 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@21 -- # val= 00:08:33.854 13:20:52 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # IFS=: 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@21 -- # val= 00:08:33.854 13:20:52 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # IFS=: 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@21 -- # val= 00:08:33.854 13:20:52 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # IFS=: 00:08:33.854 13:20:52 -- accel/accel.sh@20 -- # read -r var val 00:08:33.854 13:20:52 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:33.854 13:20:52 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:08:33.854 13:20:52 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:33.854 00:08:33.854 real 0m2.755s 00:08:33.854 user 0m2.396s 00:08:33.854 sys 0m0.355s 00:08:33.854 13:20:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.854 13:20:52 -- common/autotest_common.sh@10 -- # set +x 00:08:33.854 ************************************ 00:08:33.854 END TEST accel_copy_crc32c 00:08:33.854 ************************************ 00:08:33.854 13:20:52 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:33.854 13:20:52 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:33.854 13:20:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:33.854 13:20:52 -- common/autotest_common.sh@10 -- # set +x 00:08:33.854 ************************************ 00:08:33.854 START TEST accel_copy_crc32c_C2 00:08:33.854 ************************************ 00:08:33.854 13:20:52 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:33.854 13:20:52 -- accel/accel.sh@16 -- # local accel_opc 00:08:33.854 13:20:52 -- accel/accel.sh@17 -- # local accel_module 00:08:33.854 13:20:52 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:33.854 13:20:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:34.113 13:20:52 -- accel/accel.sh@12 -- # 
build_accel_config 00:08:34.113 13:20:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:34.113 13:20:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:34.113 13:20:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:34.113 13:20:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:34.113 13:20:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:34.113 13:20:52 -- accel/accel.sh@41 -- # local IFS=, 00:08:34.113 13:20:52 -- accel/accel.sh@42 -- # jq -r . 00:08:34.113 [2024-07-24 13:20:52.737435] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:34.113 [2024-07-24 13:20:52.737542] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3159890 ] 00:08:34.113 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.113 [2024-07-24 13:20:52.859705] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.113 [2024-07-24 13:20:52.908657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.492 13:20:54 -- accel/accel.sh@18 -- # out=' 00:08:35.492 SPDK Configuration: 00:08:35.492 Core mask: 0x1 00:08:35.492 00:08:35.492 Accel Perf Configuration: 00:08:35.492 Workload Type: copy_crc32c 00:08:35.492 CRC-32C seed: 0 00:08:35.492 Vector size: 4096 bytes 00:08:35.492 Transfer size: 8192 bytes 00:08:35.492 Vector count 2 00:08:35.492 Module: software 00:08:35.492 Queue depth: 32 00:08:35.492 Allocate depth: 32 00:08:35.492 # threads/core: 1 00:08:35.492 Run time: 1 seconds 00:08:35.492 Verify: Yes 00:08:35.492 00:08:35.492 Running for 1 seconds... 
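The bandwidth column in these tables follows directly from the transfer rate and the configured transfer size. A small sketch of that arithmetic (assuming, as the other tables in this log bear out, that MiB/s is transfers/s times bytes per transfer; the helper name is ours):

```shell
# accel_perf-style bandwidth arithmetic: transfers per second times bytes
# per transfer, expressed in MiB/s (1 MiB = 1048576 bytes), rounded.
mib_per_s() {
    echo $(( ($1 * $2 + 524288) / 1048576 ))
}

# The 4096-byte copy_crc32c run earlier reported 258624 transfers/s:
mib_per_s 258624 4096   # -> 1010, matching that table
```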
00:08:35.492 00:08:35.492 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:35.492 ------------------------------------------------------------------------------------ 00:08:35.492 0,0 183328/s 1432 MiB/s 0 0 00:08:35.492 ==================================================================================== 00:08:35.492 Total 183328/s 1432 MiB/s 0 0' 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:35.492 13:20:54 -- accel/accel.sh@12 -- # build_accel_config 00:08:35.492 13:20:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:35.492 13:20:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:35.492 13:20:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:35.492 13:20:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:35.492 13:20:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:35.492 13:20:54 -- accel/accel.sh@41 -- # local IFS=, 00:08:35.492 13:20:54 -- accel/accel.sh@42 -- # jq -r . 00:08:35.492 [2024-07-24 13:20:54.114613] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:35.492 [2024-07-24 13:20:54.114671] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3160077 ] 00:08:35.492 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.492 [2024-07-24 13:20:54.215954] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.492 [2024-07-24 13:20:54.262752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=0x1 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=copy_crc32c 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- 
accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=0 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val='8192 bytes' 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=software 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@23 -- # accel_module=software 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=32 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=32 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=1 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 
-- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val=Yes 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:35.492 13:20:54 -- accel/accel.sh@21 -- # val= 00:08:35.492 13:20:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # IFS=: 00:08:35.492 13:20:54 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@21 -- # val= 00:08:36.869 13:20:55 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # IFS=: 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@21 -- # val= 00:08:36.869 13:20:55 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # IFS=: 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@21 -- # val= 00:08:36.869 13:20:55 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # IFS=: 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@21 -- # val= 00:08:36.869 13:20:55 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # IFS=: 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@21 -- # val= 00:08:36.869 13:20:55 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # IFS=: 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@21 -- # val= 00:08:36.869 13:20:55 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # IFS=: 00:08:36.869 13:20:55 -- accel/accel.sh@20 -- # read -r var val 00:08:36.869 13:20:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:36.869 13:20:55 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:08:36.869 13:20:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:36.869 00:08:36.869 real 0m2.747s 00:08:36.869 user 0m2.393s 00:08:36.869 sys 0m0.349s 00:08:36.869 13:20:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.869 13:20:55 -- common/autotest_common.sh@10 -- # set +x 00:08:36.869 ************************************ 00:08:36.869 END TEST accel_copy_crc32c_C2 00:08:36.869 ************************************ 00:08:36.869 13:20:55 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:36.869 13:20:55 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:36.869 13:20:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:36.869 13:20:55 -- common/autotest_common.sh@10 -- # set +x 00:08:36.869 ************************************ 00:08:36.869 START TEST accel_dualcast 00:08:36.869 ************************************ 00:08:36.869 13:20:55 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:08:36.869 13:20:55 -- accel/accel.sh@16 -- # local accel_opc 00:08:36.869 13:20:55 -- accel/accel.sh@17 -- # local accel_module 00:08:36.869 13:20:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:08:36.869 13:20:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:36.869 13:20:55 -- accel/accel.sh@12 -- # build_accel_config 00:08:36.869 13:20:55 
-- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:36.869 13:20:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.869 13:20:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.869 13:20:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:36.869 13:20:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:36.869 13:20:55 -- accel/accel.sh@41 -- # local IFS=, 00:08:36.869 13:20:55 -- accel/accel.sh@42 -- # jq -r . 00:08:36.869 [2024-07-24 13:20:55.524486] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:36.869 [2024-07-24 13:20:55.524606] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3160272 ] 00:08:36.869 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.869 [2024-07-24 13:20:55.643729] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.869 [2024-07-24 13:20:55.690982] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.247 13:20:56 -- accel/accel.sh@18 -- # out=' 00:08:38.247 SPDK Configuration: 00:08:38.247 Core mask: 0x1 00:08:38.247 00:08:38.247 Accel Perf Configuration: 00:08:38.247 Workload Type: dualcast 00:08:38.247 Transfer size: 4096 bytes 00:08:38.247 Vector count 1 00:08:38.247 Module: software 00:08:38.247 Queue depth: 32 00:08:38.247 Allocate depth: 32 00:08:38.247 # threads/core: 1 00:08:38.247 Run time: 1 seconds 00:08:38.247 Verify: Yes 00:08:38.247 00:08:38.247 Running for 1 seconds... 
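The dualcast workload configured above writes one source buffer to two destination buffers in a single operation; in the software module this reduces to two plain copies, and the Verify pass checks both destinations against the source. A minimal illustrative sketch (not the SPDK implementation — temp files stand in for DMA buffers):

```shell
# One 4 KiB source "vector" copied to two destinations, then verified,
# mirroring what dualcast + Verify: Yes does above.
src=$(mktemp); dst1=$(mktemp); dst2=$(mktemp)
dd if=/dev/zero of="$src" bs=4096 count=1 2>/dev/null
cp "$src" "$dst1"
cp "$src" "$dst2"
verify=FAIL
cmp -s "$src" "$dst1" && cmp -s "$src" "$dst2" && verify=OK
echo "dualcast verify: $verify"
rm -f "$src" "$dst1" "$dst2"
```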
00:08:38.247 00:08:38.247 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:38.247 ------------------------------------------------------------------------------------ 00:08:38.247 0,0 407072/s 1590 MiB/s 0 0 00:08:38.247 ==================================================================================== 00:08:38.247 Total 407072/s 1590 MiB/s 0 0' 00:08:38.247 13:20:56 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:56 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:38.247 13:20:56 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:38.247 13:20:56 -- accel/accel.sh@12 -- # build_accel_config 00:08:38.247 13:20:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:38.247 13:20:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:38.247 13:20:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:38.247 13:20:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:38.247 13:20:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:38.247 13:20:56 -- accel/accel.sh@41 -- # local IFS=, 00:08:38.247 13:20:56 -- accel/accel.sh@42 -- # jq -r . 00:08:38.247 [2024-07-24 13:20:56.896732] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:38.247 [2024-07-24 13:20:56.896795] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3160450 ] 00:08:38.247 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.247 [2024-07-24 13:20:56.995164] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.247 [2024-07-24 13:20:57.042407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=0x1 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=dualcast 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- 
accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=software 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@23 -- # accel_module=software 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=32 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=32 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=1 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val=Yes 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 
-- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:38.247 13:20:57 -- accel/accel.sh@21 -- # val= 00:08:38.247 13:20:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # IFS=: 00:08:38.247 13:20:57 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@21 -- # val= 00:08:39.624 13:20:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # IFS=: 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@21 -- # val= 00:08:39.624 13:20:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # IFS=: 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@21 -- # val= 00:08:39.624 13:20:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # IFS=: 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@21 -- # val= 00:08:39.624 13:20:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # IFS=: 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@21 -- # val= 00:08:39.624 13:20:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # IFS=: 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@21 -- # val= 00:08:39.624 13:20:58 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # IFS=: 00:08:39.624 13:20:58 -- accel/accel.sh@20 -- # read -r var val 00:08:39.624 13:20:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:39.624 13:20:58 -- 
accel/accel.sh@28 -- # [[ -n dualcast ]] 00:08:39.624 13:20:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:39.624 00:08:39.624 real 0m2.735s 00:08:39.624 user 0m2.402s 00:08:39.624 sys 0m0.328s 00:08:39.624 13:20:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.624 13:20:58 -- common/autotest_common.sh@10 -- # set +x 00:08:39.624 ************************************ 00:08:39.624 END TEST accel_dualcast 00:08:39.624 ************************************ 00:08:39.624 13:20:58 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:39.624 13:20:58 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:39.624 13:20:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:39.624 13:20:58 -- common/autotest_common.sh@10 -- # set +x 00:08:39.624 ************************************ 00:08:39.624 START TEST accel_compare 00:08:39.624 ************************************ 00:08:39.624 13:20:58 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:08:39.624 13:20:58 -- accel/accel.sh@16 -- # local accel_opc 00:08:39.624 13:20:58 -- accel/accel.sh@17 -- # local accel_module 00:08:39.624 13:20:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:08:39.624 13:20:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:39.624 13:20:58 -- accel/accel.sh@12 -- # build_accel_config 00:08:39.624 13:20:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:39.624 13:20:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.624 13:20:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.624 13:20:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:39.624 13:20:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:39.624 13:20:58 -- accel/accel.sh@41 -- # local IFS=, 00:08:39.624 13:20:58 -- accel/accel.sh@42 -- # jq -r . 
00:08:39.624 [2024-07-24 13:20:58.302735] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:39.624 [2024-07-24 13:20:58.302829] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3160655 ] 00:08:39.624 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.624 [2024-07-24 13:20:58.424258] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.624 [2024-07-24 13:20:58.467765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.002 13:20:59 -- accel/accel.sh@18 -- # out=' 00:08:41.002 SPDK Configuration: 00:08:41.002 Core mask: 0x1 00:08:41.002 00:08:41.002 Accel Perf Configuration: 00:08:41.002 Workload Type: compare 00:08:41.002 Transfer size: 4096 bytes 00:08:41.002 Vector count 1 00:08:41.002 Module: software 00:08:41.002 Queue depth: 32 00:08:41.002 Allocate depth: 32 00:08:41.002 # threads/core: 1 00:08:41.002 Run time: 1 seconds 00:08:41.002 Verify: Yes 00:08:41.002 00:08:41.002 Running for 1 seconds... 
00:08:41.002 00:08:41.002 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:41.002 ------------------------------------------------------------------------------------ 00:08:41.002 0,0 502304/s 1962 MiB/s 0 0 00:08:41.002 ==================================================================================== 00:08:41.002 Total 502304/s 1962 MiB/s 0 0' 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:41.002 13:20:59 -- accel/accel.sh@12 -- # build_accel_config 00:08:41.002 13:20:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:41.002 13:20:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:41.002 13:20:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:41.002 13:20:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:41.002 13:20:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:41.002 13:20:59 -- accel/accel.sh@41 -- # local IFS=, 00:08:41.002 13:20:59 -- accel/accel.sh@42 -- # jq -r . 00:08:41.002 [2024-07-24 13:20:59.665406] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:41.002 [2024-07-24 13:20:59.665462] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3160837 ] 00:08:41.002 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.002 [2024-07-24 13:20:59.765785] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.002 [2024-07-24 13:20:59.809341] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=0x1 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=compare 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@24 -- # accel_opc=compare 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- 
accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=software 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@23 -- # accel_module=software 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=32 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=32 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=1 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val=Yes 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 
-- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:41.002 13:20:59 -- accel/accel.sh@21 -- # val= 00:08:41.002 13:20:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # IFS=: 00:08:41.002 13:20:59 -- accel/accel.sh@20 -- # read -r var val 00:08:42.380 13:21:00 -- accel/accel.sh@21 -- # val= 00:08:42.380 13:21:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # IFS=: 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # read -r var val 00:08:42.380 13:21:00 -- accel/accel.sh@21 -- # val= 00:08:42.380 13:21:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # IFS=: 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # read -r var val 00:08:42.380 13:21:00 -- accel/accel.sh@21 -- # val= 00:08:42.380 13:21:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # IFS=: 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # read -r var val 00:08:42.380 13:21:00 -- accel/accel.sh@21 -- # val= 00:08:42.380 13:21:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # IFS=: 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # read -r var val 00:08:42.380 13:21:00 -- accel/accel.sh@21 -- # val= 00:08:42.380 13:21:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:42.380 13:21:00 -- accel/accel.sh@20 -- # IFS=: 00:08:42.381 13:21:00 -- accel/accel.sh@20 -- # read -r var val 00:08:42.381 13:21:00 -- accel/accel.sh@21 -- # val= 00:08:42.381 13:21:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:42.381 13:21:00 -- accel/accel.sh@20 -- # IFS=: 00:08:42.381 13:21:00 -- accel/accel.sh@20 -- # read -r var val 00:08:42.381 13:21:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:42.381 13:21:00 -- 
accel/accel.sh@28 -- # [[ -n compare ]] 00:08:42.381 13:21:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:42.381 00:08:42.381 real 0m2.716s 00:08:42.381 user 0m2.392s 00:08:42.381 sys 0m0.320s 00:08:42.381 13:21:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.381 13:21:00 -- common/autotest_common.sh@10 -- # set +x 00:08:42.381 ************************************ 00:08:42.381 END TEST accel_compare 00:08:42.381 ************************************ 00:08:42.381 13:21:01 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:42.381 13:21:01 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:42.381 13:21:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:42.381 13:21:01 -- common/autotest_common.sh@10 -- # set +x 00:08:42.381 ************************************ 00:08:42.381 START TEST accel_xor 00:08:42.381 ************************************ 00:08:42.381 13:21:01 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:08:42.381 13:21:01 -- accel/accel.sh@16 -- # local accel_opc 00:08:42.381 13:21:01 -- accel/accel.sh@17 -- # local accel_module 00:08:42.381 13:21:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:08:42.381 13:21:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:42.381 13:21:01 -- accel/accel.sh@12 -- # build_accel_config 00:08:42.381 13:21:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:42.381 13:21:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.381 13:21:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.381 13:21:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:42.381 13:21:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:42.381 13:21:01 -- accel/accel.sh@41 -- # local IFS=, 00:08:42.381 13:21:01 -- accel/accel.sh@42 -- # jq -r . 
00:08:42.381 [2024-07-24 13:21:01.059599] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:42.381 [2024-07-24 13:21:01.059693] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3161094 ] 00:08:42.381 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.381 [2024-07-24 13:21:01.180913] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.381 [2024-07-24 13:21:01.225179] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.759 13:21:02 -- accel/accel.sh@18 -- # out=' 00:08:43.759 SPDK Configuration: 00:08:43.759 Core mask: 0x1 00:08:43.759 00:08:43.759 Accel Perf Configuration: 00:08:43.759 Workload Type: xor 00:08:43.759 Source buffers: 2 00:08:43.759 Transfer size: 4096 bytes 00:08:43.759 Vector count 1 00:08:43.759 Module: software 00:08:43.759 Queue depth: 32 00:08:43.759 Allocate depth: 32 00:08:43.759 # threads/core: 1 00:08:43.759 Run time: 1 seconds 00:08:43.759 Verify: Yes 00:08:43.759 00:08:43.759 Running for 1 seconds... 
00:08:43.759 00:08:43.759 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:43.760 ------------------------------------------------------------------------------------ 00:08:43.760 0,0 432320/s 1688 MiB/s 0 0 00:08:43.760 ==================================================================================== 00:08:43.760 Total 432320/s 1688 MiB/s 0 0' 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:43.760 13:21:02 -- accel/accel.sh@12 -- # build_accel_config 00:08:43.760 13:21:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:43.760 13:21:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:43.760 13:21:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:43.760 13:21:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:43.760 13:21:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:43.760 13:21:02 -- accel/accel.sh@41 -- # local IFS=, 00:08:43.760 13:21:02 -- accel/accel.sh@42 -- # jq -r . 00:08:43.760 [2024-07-24 13:21:02.411749] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:43.760 [2024-07-24 13:21:02.411806] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3161341 ] 00:08:43.760 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.760 [2024-07-24 13:21:02.511430] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.760 [2024-07-24 13:21:02.556383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=0x1 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=xor 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@24 -- # accel_opc=xor 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- 
accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=2 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=software 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@23 -- # accel_module=software 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=32 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=32 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=1 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- 
# read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val=Yes 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:43.760 13:21:02 -- accel/accel.sh@21 -- # val= 00:08:43.760 13:21:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # IFS=: 00:08:43.760 13:21:02 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@21 -- # val= 00:08:45.136 13:21:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # IFS=: 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@21 -- # val= 00:08:45.136 13:21:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # IFS=: 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@21 -- # val= 00:08:45.136 13:21:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # IFS=: 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@21 -- # val= 00:08:45.136 13:21:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # IFS=: 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@21 -- # val= 00:08:45.136 13:21:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # IFS=: 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@21 -- # val= 00:08:45.136 13:21:03 -- accel/accel.sh@22 -- # case 
"$var" in 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # IFS=: 00:08:45.136 13:21:03 -- accel/accel.sh@20 -- # read -r var val 00:08:45.136 13:21:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:45.136 13:21:03 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:08:45.136 13:21:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:45.136 00:08:45.136 real 0m2.700s 00:08:45.136 user 0m2.363s 00:08:45.136 sys 0m0.332s 00:08:45.136 13:21:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:45.136 13:21:03 -- common/autotest_common.sh@10 -- # set +x 00:08:45.136 ************************************ 00:08:45.136 END TEST accel_xor 00:08:45.136 ************************************ 00:08:45.136 13:21:03 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:45.136 13:21:03 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:45.136 13:21:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:45.136 13:21:03 -- common/autotest_common.sh@10 -- # set +x 00:08:45.136 ************************************ 00:08:45.136 START TEST accel_xor 00:08:45.136 ************************************ 00:08:45.136 13:21:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:08:45.136 13:21:03 -- accel/accel.sh@16 -- # local accel_opc 00:08:45.136 13:21:03 -- accel/accel.sh@17 -- # local accel_module 00:08:45.136 13:21:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:08:45.136 13:21:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:45.136 13:21:03 -- accel/accel.sh@12 -- # build_accel_config 00:08:45.136 13:21:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:45.136 13:21:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:45.136 13:21:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:45.136 13:21:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:45.136 13:21:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 
00:08:45.136 13:21:03 -- accel/accel.sh@41 -- # local IFS=, 00:08:45.136 13:21:03 -- accel/accel.sh@42 -- # jq -r . 00:08:45.136 [2024-07-24 13:21:03.801384] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:45.136 [2024-07-24 13:21:03.801476] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3161539 ] 00:08:45.136 EAL: No free 2048 kB hugepages reported on node 1 00:08:45.136 [2024-07-24 13:21:03.922931] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.136 [2024-07-24 13:21:03.970008] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.512 13:21:05 -- accel/accel.sh@18 -- # out=' 00:08:46.512 SPDK Configuration: 00:08:46.512 Core mask: 0x1 00:08:46.512 00:08:46.512 Accel Perf Configuration: 00:08:46.512 Workload Type: xor 00:08:46.512 Source buffers: 3 00:08:46.512 Transfer size: 4096 bytes 00:08:46.512 Vector count 1 00:08:46.512 Module: software 00:08:46.512 Queue depth: 32 00:08:46.512 Allocate depth: 32 00:08:46.512 # threads/core: 1 00:08:46.512 Run time: 1 seconds 00:08:46.512 Verify: Yes 00:08:46.512 00:08:46.512 Running for 1 seconds... 
00:08:46.512 00:08:46.512 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:46.512 ------------------------------------------------------------------------------------ 00:08:46.512 0,0 414752/s 1620 MiB/s 0 0 00:08:46.512 ==================================================================================== 00:08:46.512 Total 414752/s 1620 MiB/s 0 0' 00:08:46.512 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.512 13:21:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:46.512 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.512 13:21:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:46.512 13:21:05 -- accel/accel.sh@12 -- # build_accel_config 00:08:46.513 13:21:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:46.513 13:21:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:46.513 13:21:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:46.513 13:21:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:46.513 13:21:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:46.513 13:21:05 -- accel/accel.sh@41 -- # local IFS=, 00:08:46.513 13:21:05 -- accel/accel.sh@42 -- # jq -r . 00:08:46.513 [2024-07-24 13:21:05.163017] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:46.513 [2024-07-24 13:21:05.163075] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3161951 ] 00:08:46.513 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.513 [2024-07-24 13:21:05.264603] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.513 [2024-07-24 13:21:05.311654] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=0x1 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=xor 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@24 -- # accel_opc=xor 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- 
accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=3 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=software 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@23 -- # accel_module=software 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=32 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=32 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=1 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- 
# read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val=Yes 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:46.513 13:21:05 -- accel/accel.sh@21 -- # val= 00:08:46.513 13:21:05 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # IFS=: 00:08:46.513 13:21:05 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@21 -- # val= 00:08:47.935 13:21:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # IFS=: 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@21 -- # val= 00:08:47.935 13:21:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # IFS=: 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@21 -- # val= 00:08:47.935 13:21:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # IFS=: 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@21 -- # val= 00:08:47.935 13:21:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # IFS=: 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@21 -- # val= 00:08:47.935 13:21:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # IFS=: 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@21 -- # val= 00:08:47.935 13:21:06 -- accel/accel.sh@22 -- # case 
"$var" in 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # IFS=: 00:08:47.935 13:21:06 -- accel/accel.sh@20 -- # read -r var val 00:08:47.935 13:21:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:47.935 13:21:06 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:08:47.935 13:21:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:47.935 00:08:47.935 real 0m2.716s 00:08:47.935 user 0m2.353s 00:08:47.935 sys 0m0.355s 00:08:47.935 13:21:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.935 13:21:06 -- common/autotest_common.sh@10 -- # set +x 00:08:47.935 ************************************ 00:08:47.935 END TEST accel_xor 00:08:47.935 ************************************ 00:08:47.935 13:21:06 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:47.935 13:21:06 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:47.935 13:21:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:47.935 13:21:06 -- common/autotest_common.sh@10 -- # set +x 00:08:47.935 ************************************ 00:08:47.935 START TEST accel_dif_verify 00:08:47.935 ************************************ 00:08:47.935 13:21:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:08:47.935 13:21:06 -- accel/accel.sh@16 -- # local accel_opc 00:08:47.935 13:21:06 -- accel/accel.sh@17 -- # local accel_module 00:08:47.935 13:21:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:08:47.935 13:21:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:47.935 13:21:06 -- accel/accel.sh@12 -- # build_accel_config 00:08:47.935 13:21:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:47.935 13:21:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:47.935 13:21:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:47.935 13:21:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:47.935 13:21:06 -- accel/accel.sh@37 -- # [[ 
-n '' ]]
00:08:47.935 13:21:06 -- accel/accel.sh@41 -- # local IFS=,
00:08:47.935 13:21:06 -- accel/accel.sh@42 -- # jq -r .
00:08:47.935 [2024-07-24 13:21:06.554486] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:47.935 [2024-07-24 13:21:06.554579] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162309 ]
00:08:47.935 EAL: No free 2048 kB hugepages reported on node 1
00:08:47.935 [2024-07-24 13:21:06.674473] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:47.935 [2024-07-24 13:21:06.718113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:49.312 13:21:07 -- accel/accel.sh@18 -- # out='
00:08:49.312 SPDK Configuration:
00:08:49.312 Core mask: 0x1
00:08:49.312
00:08:49.312 Accel Perf Configuration:
00:08:49.312 Workload Type: dif_verify
00:08:49.312 Vector size: 4096 bytes
00:08:49.312 Transfer size: 4096 bytes
00:08:49.312 Block size: 512 bytes
00:08:49.312 Metadata size: 8 bytes
00:08:49.312 Vector count 1
00:08:49.312 Module: software
00:08:49.312 Queue depth: 32
00:08:49.312 Allocate depth: 32
00:08:49.312 # threads/core: 1
00:08:49.312 Run time: 1 seconds
00:08:49.312 Verify: No
00:08:49.312
00:08:49.312 Running for 1 seconds...
00:08:49.312
00:08:49.312 Core,Thread Transfers Bandwidth Failed Miscompares
00:08:49.312 ------------------------------------------------------------------------------------
00:08:49.312 0,0 146368/s 580 MiB/s 0 0
00:08:49.312 ====================================================================================
00:08:49.312 Total 146368/s 571 MiB/s 0 0'
00:08:49.312 13:21:07 -- accel/accel.sh@20 -- # IFS=:
00:08:49.312 13:21:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify
00:08:49.312 13:21:07 -- accel/accel.sh@20 -- # read -r var val
00:08:49.312 13:21:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
00:08:49.312 13:21:07 -- accel/accel.sh@12 -- # build_accel_config
00:08:49.312 13:21:07 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:49.312 13:21:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:49.312 13:21:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:49.312 13:21:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:49.312 13:21:07 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:49.312 13:21:07 -- accel/accel.sh@41 -- # local IFS=,
00:08:49.312 13:21:07 -- accel/accel.sh@42 -- # jq -r .
00:08:49.312 [2024-07-24 13:21:07.916066] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
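Editor's note: the "Total" row above reports 146368 transfers/s at the configured 4096-byte transfer size, which is where the printed 571 MiB/s comes from. The snippet below only re-derives that arithmetic from the two numbers in the table; it is an illustrative check, not part of accel.sh:

```shell
# Recompute the Total bandwidth line from transfers/s and the configured
# "Transfer size: 4096 bytes"; integer MiB/s, truncated as accel_perf prints it.
transfers_per_sec=146368
transfer_size=4096
awk -v t="$transfers_per_sec" -v s="$transfer_size" \
    'BEGIN { printf "%d MiB/s\n", (t * s) / (1024 * 1024) }'
# → 571 MiB/s
```

146368 × 4096 = 599,523,328 bytes/s, or 571.75 MiB/s, truncated to the 571 MiB/s shown in the table.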
00:08:49.313 [2024-07-24 13:21:07.916126] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162495 ] 00:08:49.313 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.313 [2024-07-24 13:21:08.017793] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.313 [2024-07-24 13:21:08.065050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=0x1 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=dif_verify 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- 
accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val='512 bytes' 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val='8 bytes' 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=software 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@23 -- # accel_module=software 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=32 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=32 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- 
accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=1 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val=No 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:49.313 13:21:08 -- accel/accel.sh@21 -- # val= 00:08:49.313 13:21:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # IFS=: 00:08:49.313 13:21:08 -- accel/accel.sh@20 -- # read -r var val 00:08:50.688 13:21:09 -- accel/accel.sh@21 -- # val= 00:08:50.688 13:21:09 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # IFS=: 00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # read -r var val 00:08:50.688 13:21:09 -- accel/accel.sh@21 -- # val= 00:08:50.688 13:21:09 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # IFS=: 00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # read -r var val 00:08:50.688 13:21:09 -- accel/accel.sh@21 -- # val= 00:08:50.688 13:21:09 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # IFS=: 00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # read -r var val 00:08:50.688 13:21:09 -- accel/accel.sh@21 -- # val= 00:08:50.688 13:21:09 
-- accel/accel.sh@22 -- # case "$var" in
00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # IFS=:
00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # read -r var val
00:08:50.688 13:21:09 -- accel/accel.sh@21 -- # val=
00:08:50.688 13:21:09 -- accel/accel.sh@22 -- # case "$var" in
00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # IFS=:
00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # read -r var val
00:08:50.688 13:21:09 -- accel/accel.sh@21 -- # val=
00:08:50.688 13:21:09 -- accel/accel.sh@22 -- # case "$var" in
00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # IFS=:
00:08:50.688 13:21:09 -- accel/accel.sh@20 -- # read -r var val
00:08:50.688 13:21:09 -- accel/accel.sh@28 -- # [[ -n software ]]
00:08:50.688 13:21:09 -- accel/accel.sh@28 -- # [[ -n dif_verify ]]
00:08:50.688 13:21:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:50.688
00:08:50.688 real 0m2.732s
00:08:50.688 user 0m2.395s
00:08:50.688 sys 0m0.334s
00:08:50.688 13:21:09 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:50.688 13:21:09 -- common/autotest_common.sh@10 -- # set +x
00:08:50.688 ************************************
00:08:50.688 END TEST accel_dif_verify
00:08:50.688 ************************************
00:08:50.688 13:21:09 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate
00:08:50.688 13:21:09 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']'
00:08:50.688 13:21:09 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:08:50.688 13:21:09 -- common/autotest_common.sh@10 -- # set +x
00:08:50.688 ************************************
00:08:50.688 START TEST accel_dif_generate
00:08:50.688 ************************************
00:08:50.688 13:21:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate
00:08:50.688 13:21:09 -- accel/accel.sh@16 -- # local accel_opc
00:08:50.688 13:21:09 -- accel/accel.sh@17 -- # local accel_module
00:08:50.688 13:21:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate
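Editor's note: the long runs of `IFS=:`, `read -r var val`, and `case "$var" in` records in this trace come from accel.sh splitting each line of accel_perf's output on `:` into a key and a value, then checking the extracted workload and module at the end (`[[ -n software ]]`, `[[ -n dif_verify ]]`). A minimal sketch of that shape follows; the function name and case patterns are assumptions for illustration, not the verbatim accel.sh code:

```shell
# Split "Key: value" lines on ':' and pick out the two fields that the
# trace above checks at the end of each run.
parse_accel_output() {
    local accel_opc="" accel_module=""
    while IFS=: read -r var val; do
        case "$var" in
            *"Workload Type") accel_opc=${val# } ;;   # strip the leading space
            *"Module")        accel_module=${val# } ;;
        esac
    done
    echo "opc=$accel_opc module=$accel_module"
}
printf 'Workload Type: dif_verify\nModule: software\n' | parse_accel_output
# → opc=dif_verify module=software
```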
00:08:50.688 13:21:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:08:50.688 13:21:09 -- accel/accel.sh@12 -- # build_accel_config
00:08:50.688 13:21:09 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:50.688 13:21:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:50.688 13:21:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:50.688 13:21:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:50.688 13:21:09 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:50.688 13:21:09 -- accel/accel.sh@41 -- # local IFS=,
00:08:50.688 13:21:09 -- accel/accel.sh@42 -- # jq -r .
00:08:50.688 [2024-07-24 13:21:09.331233] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:50.688 [2024-07-24 13:21:09.331334] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162697 ]
00:08:50.688 EAL: No free 2048 kB hugepages reported on node 1
00:08:50.688 [2024-07-24 13:21:09.449750] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:50.688 [2024-07-24 13:21:09.496997] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:52.063 13:21:10 -- accel/accel.sh@18 -- # out='
00:08:52.063 SPDK Configuration:
00:08:52.063 Core mask: 0x1
00:08:52.063
00:08:52.063 Accel Perf Configuration:
00:08:52.063 Workload Type: dif_generate
00:08:52.063 Vector size: 4096 bytes
00:08:52.063 Transfer size: 4096 bytes
00:08:52.063 Block size: 512 bytes
00:08:52.063 Metadata size: 8 bytes
00:08:52.063 Vector count 1
00:08:52.063 Module: software
00:08:52.063 Queue depth: 32
00:08:52.063 Allocate depth: 32
00:08:52.063 # threads/core: 1
00:08:52.063 Run time: 1 seconds
00:08:52.063 Verify: No
00:08:52.063
00:08:52.063 Running for 1 seconds...
00:08:52.063
00:08:52.063 Core,Thread Transfers Bandwidth Failed Miscompares
00:08:52.063 ------------------------------------------------------------------------------------
00:08:52.063 0,0 178368/s 707 MiB/s 0 0
00:08:52.063 ====================================================================================
00:08:52.063 Total 178368/s 696 MiB/s 0 0'
00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=:
00:08:52.063 13:21:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate
00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val
00:08:52.063 13:21:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:08:52.063 13:21:10 -- accel/accel.sh@12 -- # build_accel_config
00:08:52.063 13:21:10 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:52.063 13:21:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:52.063 13:21:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:52.063 13:21:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:52.063 13:21:10 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:52.063 13:21:10 -- accel/accel.sh@41 -- # local IFS=,
00:08:52.063 13:21:10 -- accel/accel.sh@42 -- # jq -r .
00:08:52.063 [2024-07-24 13:21:10.704151] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
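Editor's note: every run in this log invokes accel_perf with `-c /dev/fd/62`, i.e. the JSON accel configuration is handed to the child over file descriptor 62 instead of a file on disk. The fd-passing pattern itself can be reproduced with any reader; below, `cat` stands in for accel_perf and the JSON body is a placeholder, not the real config. This relies on bash here-string redirection to an explicit fd and a `/dev/fd` filesystem, both present on this Linux CI host:

```shell
# Hand a config string to a child process on fd 62 and let it read the
# config back through /dev/fd/62, mirroring the accel_perf invocations above.
cfg='{"subsystems": []}'
cat /dev/fd/62 62<<< "$cfg"
# → {"subsystems": []}
```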
00:08:52.063 [2024-07-24 13:21:10.704207] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162891 ] 00:08:52.063 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.063 [2024-07-24 13:21:10.803011] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.063 [2024-07-24 13:21:10.850230] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=0x1 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=dif_generate 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 
-- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val='512 bytes' 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val='8 bytes' 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=software 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@23 -- # accel_module=software 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=32 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=32 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 
-- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=1 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val=No 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:52.063 13:21:10 -- accel/accel.sh@21 -- # val= 00:08:52.063 13:21:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # IFS=: 00:08:52.063 13:21:10 -- accel/accel.sh@20 -- # read -r var val 00:08:53.438 13:21:12 -- accel/accel.sh@21 -- # val= 00:08:53.438 13:21:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # IFS=: 00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # read -r var val 00:08:53.438 13:21:12 -- accel/accel.sh@21 -- # val= 00:08:53.438 13:21:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # IFS=: 00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # read -r var val 00:08:53.438 13:21:12 -- accel/accel.sh@21 -- # val= 00:08:53.438 13:21:12 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # IFS=: 00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # read -r var val 00:08:53.438 13:21:12 -- accel/accel.sh@21 -- # val= 00:08:53.438 
13:21:12 -- accel/accel.sh@22 -- # case "$var" in
00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # IFS=:
00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # read -r var val
00:08:53.438 13:21:12 -- accel/accel.sh@21 -- # val=
00:08:53.438 13:21:12 -- accel/accel.sh@22 -- # case "$var" in
00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # IFS=:
00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # read -r var val
00:08:53.438 13:21:12 -- accel/accel.sh@21 -- # val=
00:08:53.438 13:21:12 -- accel/accel.sh@22 -- # case "$var" in
00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # IFS=:
00:08:53.438 13:21:12 -- accel/accel.sh@20 -- # read -r var val
00:08:53.438 13:21:12 -- accel/accel.sh@28 -- # [[ -n software ]]
00:08:53.438 13:21:12 -- accel/accel.sh@28 -- # [[ -n dif_generate ]]
00:08:53.438 13:21:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:53.438
00:08:53.438 real 0m2.740s
00:08:53.438 user 0m2.393s
00:08:53.438 sys 0m0.344s
00:08:53.438 13:21:12 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:53.438 13:21:12 -- common/autotest_common.sh@10 -- # set +x
00:08:53.438 ************************************
00:08:53.438 END TEST accel_dif_generate
00:08:53.438 ************************************
00:08:53.438 13:21:12 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
00:08:53.438 13:21:12 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']'
00:08:53.438 13:21:12 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:08:53.438 13:21:12 -- common/autotest_common.sh@10 -- # set +x
00:08:53.438 ************************************
00:08:53.438 START TEST accel_dif_generate_copy
00:08:53.438 ************************************
00:08:53.438 13:21:12 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy
00:08:53.438 13:21:12 -- accel/accel.sh@16 -- # local accel_opc
00:08:53.438 13:21:12 -- accel/accel.sh@17 -- # local accel_module
00:08:53.438 13:21:12 -- accel/accel.sh@18 -- #
accel_perf -t 1 -w dif_generate_copy
00:08:53.438 13:21:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:08:53.438 13:21:12 -- accel/accel.sh@12 -- # build_accel_config
00:08:53.438 13:21:12 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:53.438 13:21:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:53.438 13:21:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:53.438 13:21:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:53.438 13:21:12 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:53.438 13:21:12 -- accel/accel.sh@41 -- # local IFS=,
00:08:53.438 13:21:12 -- accel/accel.sh@42 -- # jq -r .
00:08:53.438 [2024-07-24 13:21:12.112560] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
00:08:53.438 [2024-07-24 13:21:12.112653] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163084 ]
00:08:53.438 EAL: No free 2048 kB hugepages reported on node 1
00:08:53.438 [2024-07-24 13:21:12.232947] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:53.438 [2024-07-24 13:21:12.280232] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:54.813 13:21:13 -- accel/accel.sh@18 -- # out='
00:08:54.813 SPDK Configuration:
00:08:54.813 Core mask: 0x1
00:08:54.813
00:08:54.813 Accel Perf Configuration:
00:08:54.813 Workload Type: dif_generate_copy
00:08:54.813 Vector size: 4096 bytes
00:08:54.813 Transfer size: 4096 bytes
00:08:54.813 Vector count 1
00:08:54.813 Module: software
00:08:54.813 Queue depth: 32
00:08:54.813 Allocate depth: 32
00:08:54.813 # threads/core: 1
00:08:54.813 Run time: 1 seconds
00:08:54.813 Verify: No
00:08:54.813
00:08:54.813 Running for 1 seconds...
00:08:54.813
00:08:54.813 Core,Thread Transfers Bandwidth Failed Miscompares
00:08:54.813 ------------------------------------------------------------------------------------
00:08:54.813 0,0 137920/s 547 MiB/s 0 0
00:08:54.813 ====================================================================================
00:08:54.813 Total 137920/s 538 MiB/s 0 0'
00:08:54.813 13:21:13 -- accel/accel.sh@20 -- # IFS=:
00:08:54.813 13:21:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy
00:08:54.813 13:21:13 -- accel/accel.sh@20 -- # read -r var val
00:08:54.813 13:21:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:08:54.813 13:21:13 -- accel/accel.sh@12 -- # build_accel_config
00:08:54.813 13:21:13 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:08:54.813 13:21:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:54.813 13:21:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:54.813 13:21:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:08:54.813 13:21:13 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:08:54.813 13:21:13 -- accel/accel.sh@41 -- # local IFS=,
00:08:54.813 13:21:13 -- accel/accel.sh@42 -- # jq -r .
00:08:54.813 [2024-07-24 13:21:13.497917] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization...
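Editor's note: the `real/user/sys` triples and the `START TEST`/`END TEST` banners framing each workload come from a wrapper that times the test body. The sketch below is a simplified stand-in for that pattern; the real `run_test` lives in SPDK's autotest_common.sh and does considerably more bookkeeping:

```shell
# Simplified run_test: print banners around the test and time its body,
# which is what yields the real/user/sys lines seen in this log.
run_test() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"                      # real/user/sys summary goes to stderr
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
}
run_test demo_true true
```

Note that `time` here is the bash keyword, so it can time the whole command line, including shell functions.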
00:08:54.813 [2024-07-24 13:21:13.498041] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163270 ] 00:08:54.813 EAL: No free 2048 kB hugepages reported on node 1 00:08:54.813 [2024-07-24 13:21:13.617557] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.813 [2024-07-24 13:21:13.664757] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=0x1 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 
13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=software 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@23 -- # accel_module=software 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=32 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=32 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=1 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 
-- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val=No 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:55.072 13:21:13 -- accel/accel.sh@21 -- # val= 00:08:55.072 13:21:13 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # IFS=: 00:08:55.072 13:21:13 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@21 -- # val= 00:08:56.009 13:21:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # IFS=: 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@21 -- # val= 00:08:56.009 13:21:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # IFS=: 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@21 -- # val= 00:08:56.009 13:21:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # IFS=: 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@21 -- # val= 00:08:56.009 13:21:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # IFS=: 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@21 -- # val= 00:08:56.009 13:21:14 -- accel/accel.sh@22 -- # case "$var" in 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # IFS=: 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@21 -- # val= 00:08:56.009 13:21:14 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # IFS=: 00:08:56.009 13:21:14 -- accel/accel.sh@20 -- # read -r var val 00:08:56.009 13:21:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:56.010 13:21:14 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:08:56.010 13:21:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:56.010 00:08:56.010 real 0m2.774s 00:08:56.010 user 0m2.403s 00:08:56.010 sys 0m0.367s 00:08:56.010 13:21:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.010 13:21:14 -- common/autotest_common.sh@10 -- # set +x 00:08:56.010 ************************************ 00:08:56.010 END TEST accel_dif_generate_copy 00:08:56.010 ************************************ 00:08:56.269 13:21:14 -- accel/accel.sh@107 -- # [[ y == y ]] 00:08:56.269 13:21:14 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.269 13:21:14 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:56.269 13:21:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:56.269 13:21:14 -- common/autotest_common.sh@10 -- # set +x 00:08:56.269 ************************************ 00:08:56.269 START TEST accel_comp 00:08:56.269 ************************************ 00:08:56.269 13:21:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.269 13:21:14 -- accel/accel.sh@16 -- # local accel_opc 00:08:56.269 13:21:14 -- accel/accel.sh@17 -- # local accel_module 00:08:56.269 13:21:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.269 13:21:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.269 13:21:14 -- accel/accel.sh@12 -- # build_accel_config 00:08:56.269 13:21:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:56.269 13:21:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:56.269 13:21:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:56.269 13:21:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:56.269 13:21:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:56.269 13:21:14 -- accel/accel.sh@41 -- # local IFS=, 00:08:56.269 13:21:14 -- accel/accel.sh@42 -- # jq -r . 00:08:56.269 [2024-07-24 13:21:14.932976] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:56.269 [2024-07-24 13:21:14.933079] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163466 ] 00:08:56.269 EAL: No free 2048 kB hugepages reported on node 1 00:08:56.269 [2024-07-24 13:21:15.040503] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.269 [2024-07-24 13:21:15.091376] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.646 13:21:16 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:57.646 00:08:57.646 SPDK Configuration: 00:08:57.646 Core mask: 0x1 00:08:57.646 00:08:57.646 Accel Perf Configuration: 00:08:57.646 Workload Type: compress 00:08:57.646 Transfer size: 4096 bytes 00:08:57.646 Vector count 1 00:08:57.646 Module: software 00:08:57.646 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.646 Queue depth: 32 00:08:57.646 Allocate depth: 32 00:08:57.646 # threads/core: 1 00:08:57.646 Run time: 1 seconds 00:08:57.646 Verify: No 00:08:57.646 00:08:57.646 Running for 1 seconds... 
00:08:57.646 00:08:57.646 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:57.646 ------------------------------------------------------------------------------------ 00:08:57.646 0,0 43968/s 183 MiB/s 0 0 00:08:57.646 ==================================================================================== 00:08:57.646 Total 43968/s 171 MiB/s 0 0' 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.646 13:21:16 -- accel/accel.sh@12 -- # build_accel_config 00:08:57.646 13:21:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:57.646 13:21:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:57.646 13:21:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:57.646 13:21:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:57.646 13:21:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:57.646 13:21:16 -- accel/accel.sh@41 -- # local IFS=, 00:08:57.646 13:21:16 -- accel/accel.sh@42 -- # jq -r . 00:08:57.646 [2024-07-24 13:21:16.302576] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:08:57.646 [2024-07-24 13:21:16.302635] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163646 ] 00:08:57.646 EAL: No free 2048 kB hugepages reported on node 1 00:08:57.646 [2024-07-24 13:21:16.401969] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.646 [2024-07-24 13:21:16.448826] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=0x1 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 
-- # val=compress 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@24 -- # accel_opc=compress 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=software 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@23 -- # accel_module=software 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=32 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=32 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=1 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # 
IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val=No 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.646 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.646 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.646 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.905 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:57.905 13:21:16 -- accel/accel.sh@21 -- # val= 00:08:57.905 13:21:16 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.905 13:21:16 -- accel/accel.sh@20 -- # IFS=: 00:08:57.905 13:21:16 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@21 -- # val= 00:08:58.842 13:21:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # IFS=: 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@21 -- # val= 00:08:58.842 13:21:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # IFS=: 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@21 -- # val= 00:08:58.842 13:21:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # IFS=: 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@21 -- # val= 00:08:58.842 13:21:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # IFS=: 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@21 
-- # val= 00:08:58.842 13:21:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # IFS=: 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@21 -- # val= 00:08:58.842 13:21:17 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # IFS=: 00:08:58.842 13:21:17 -- accel/accel.sh@20 -- # read -r var val 00:08:58.842 13:21:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:58.842 13:21:17 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:08:58.842 13:21:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:58.842 00:08:58.842 real 0m2.740s 00:08:58.842 user 0m2.403s 00:08:58.842 sys 0m0.335s 00:08:58.842 13:21:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:58.842 13:21:17 -- common/autotest_common.sh@10 -- # set +x 00:08:58.842 ************************************ 00:08:58.842 END TEST accel_comp 00:08:58.842 ************************************ 00:08:58.842 13:21:17 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.842 13:21:17 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:58.842 13:21:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:58.842 13:21:17 -- common/autotest_common.sh@10 -- # set +x 00:08:58.842 ************************************ 00:08:58.842 START TEST accel_decomp 00:08:58.842 ************************************ 00:08:58.842 13:21:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.842 13:21:17 -- accel/accel.sh@16 -- # local accel_opc 00:08:58.842 13:21:17 -- accel/accel.sh@17 -- # local accel_module 00:08:58.842 13:21:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.842 
13:21:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.842 13:21:17 -- accel/accel.sh@12 -- # build_accel_config 00:08:58.842 13:21:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:58.842 13:21:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:58.842 13:21:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:58.842 13:21:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:58.842 13:21:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:58.842 13:21:17 -- accel/accel.sh@41 -- # local IFS=, 00:08:58.842 13:21:17 -- accel/accel.sh@42 -- # jq -r . 00:08:59.101 [2024-07-24 13:21:17.714446] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:08:59.101 [2024-07-24 13:21:17.714543] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163845 ] 00:08:59.101 EAL: No free 2048 kB hugepages reported on node 1 00:08:59.101 [2024-07-24 13:21:17.834315] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.101 [2024-07-24 13:21:17.882454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.478 13:21:19 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:09:00.478 00:09:00.478 SPDK Configuration: 00:09:00.478 Core mask: 0x1 00:09:00.478 00:09:00.478 Accel Perf Configuration: 00:09:00.478 Workload Type: decompress 00:09:00.478 Transfer size: 4096 bytes 00:09:00.478 Vector count 1 00:09:00.478 Module: software 00:09:00.478 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:00.478 Queue depth: 32 00:09:00.478 Allocate depth: 32 00:09:00.478 # threads/core: 1 00:09:00.478 Run time: 1 seconds 00:09:00.478 Verify: Yes 00:09:00.478 00:09:00.478 Running for 1 seconds... 00:09:00.478 00:09:00.478 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:00.478 ------------------------------------------------------------------------------------ 00:09:00.478 0,0 61024/s 112 MiB/s 0 0 00:09:00.478 ==================================================================================== 00:09:00.478 Total 61024/s 238 MiB/s 0 0' 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:09:00.478 13:21:19 -- accel/accel.sh@12 -- # build_accel_config 00:09:00.478 13:21:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:00.478 13:21:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:00.478 13:21:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:00.478 13:21:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:00.478 13:21:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:00.478 13:21:19 -- accel/accel.sh@41 -- # local IFS=, 00:09:00.478 13:21:19 -- accel/accel.sh@42 -- # jq -r . 
00:09:00.478 [2024-07-24 13:21:19.103054] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:00.478 [2024-07-24 13:21:19.103150] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164028 ] 00:09:00.478 EAL: No free 2048 kB hugepages reported on node 1 00:09:00.478 [2024-07-24 13:21:19.225459] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.478 [2024-07-24 13:21:19.272881] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=0x1 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- 
accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=decompress 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=software 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@23 -- # accel_module=software 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=32 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=32 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- 
accel/accel.sh@21 -- # val=1 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val=Yes 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:00.478 13:21:19 -- accel/accel.sh@21 -- # val= 00:09:00.478 13:21:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # IFS=: 00:09:00.478 13:21:19 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@21 -- # val= 00:09:01.854 13:21:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # IFS=: 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@21 -- # val= 00:09:01.854 13:21:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # IFS=: 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@21 -- # val= 00:09:01.854 13:21:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # IFS=: 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@21 -- # val= 00:09:01.854 13:21:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.854 13:21:20 
-- accel/accel.sh@20 -- # IFS=: 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@21 -- # val= 00:09:01.854 13:21:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # IFS=: 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@21 -- # val= 00:09:01.854 13:21:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # IFS=: 00:09:01.854 13:21:20 -- accel/accel.sh@20 -- # read -r var val 00:09:01.854 13:21:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:01.854 13:21:20 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:01.854 13:21:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:01.854 00:09:01.854 real 0m2.780s 00:09:01.854 user 0m2.407s 00:09:01.854 sys 0m0.369s 00:09:01.854 13:21:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.854 13:21:20 -- common/autotest_common.sh@10 -- # set +x 00:09:01.854 ************************************ 00:09:01.854 END TEST accel_decomp 00:09:01.854 ************************************ 00:09:01.854 13:21:20 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.854 13:21:20 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:09:01.854 13:21:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:01.854 13:21:20 -- common/autotest_common.sh@10 -- # set +x 00:09:01.854 ************************************ 00:09:01.854 START TEST accel_decmop_full 00:09:01.854 ************************************ 00:09:01.854 13:21:20 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.854 13:21:20 -- accel/accel.sh@16 -- # local accel_opc 00:09:01.854 13:21:20 -- accel/accel.sh@17 -- # local accel_module 
00:09:01.854 13:21:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.854 13:21:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.854 13:21:20 -- accel/accel.sh@12 -- # build_accel_config 00:09:01.854 13:21:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:01.854 13:21:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:01.854 13:21:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:01.854 13:21:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:01.854 13:21:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:01.854 13:21:20 -- accel/accel.sh@41 -- # local IFS=, 00:09:01.854 13:21:20 -- accel/accel.sh@42 -- # jq -r . 00:09:01.854 [2024-07-24 13:21:20.531663] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:01.854 [2024-07-24 13:21:20.531755] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164226 ] 00:09:01.854 EAL: No free 2048 kB hugepages reported on node 1 00:09:01.854 [2024-07-24 13:21:20.650130] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.854 [2024-07-24 13:21:20.694967] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.231 13:21:21 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:09:03.231 00:09:03.231 SPDK Configuration: 00:09:03.231 Core mask: 0x1 00:09:03.231 00:09:03.231 Accel Perf Configuration: 00:09:03.231 Workload Type: decompress 00:09:03.231 Transfer size: 111250 bytes 00:09:03.231 Vector count 1 00:09:03.231 Module: software 00:09:03.231 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:03.231 Queue depth: 32 00:09:03.231 Allocate depth: 32 00:09:03.231 # threads/core: 1 00:09:03.231 Run time: 1 seconds 00:09:03.231 Verify: Yes 00:09:03.231 00:09:03.231 Running for 1 seconds... 00:09:03.231 00:09:03.231 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:03.231 ------------------------------------------------------------------------------------ 00:09:03.231 0,0 3872/s 159 MiB/s 0 0 00:09:03.231 ==================================================================================== 00:09:03.231 Total 3872/s 410 MiB/s 0 0' 00:09:03.231 13:21:21 -- accel/accel.sh@20 -- # IFS=: 00:09:03.231 13:21:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:03.231 13:21:21 -- accel/accel.sh@20 -- # read -r var val 00:09:03.231 13:21:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:03.231 13:21:21 -- accel/accel.sh@12 -- # build_accel_config 00:09:03.231 13:21:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:03.231 13:21:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:03.231 13:21:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:03.231 13:21:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:03.231 13:21:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:03.231 13:21:21 -- accel/accel.sh@41 -- # local IFS=, 00:09:03.231 13:21:21 -- accel/accel.sh@42 -- # jq -r . 
00:09:03.231 [2024-07-24 13:21:21.905918] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:03.231 [2024-07-24 13:21:21.905980] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164423 ] 00:09:03.231 EAL: No free 2048 kB hugepages reported on node 1 00:09:03.231 [2024-07-24 13:21:22.006235] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.231 [2024-07-24 13:21:22.050268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.231 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.231 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.231 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.231 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.231 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.231 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.231 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.231 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.231 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=0x1 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- 
accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=decompress 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val='111250 bytes' 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=software 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@23 -- # accel_module=software 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=32 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=32 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- 
accel/accel.sh@21 -- # val=1 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val=Yes 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.490 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.490 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.490 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.491 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:03.491 13:21:22 -- accel/accel.sh@21 -- # val= 00:09:03.491 13:21:22 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.491 13:21:22 -- accel/accel.sh@20 -- # IFS=: 00:09:03.491 13:21:22 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@21 -- # val= 00:09:04.427 13:21:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # IFS=: 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@21 -- # val= 00:09:04.427 13:21:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # IFS=: 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@21 -- # val= 00:09:04.427 13:21:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # IFS=: 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@21 -- # val= 00:09:04.427 13:21:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.427 13:21:23 
-- accel/accel.sh@20 -- # IFS=: 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@21 -- # val= 00:09:04.427 13:21:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # IFS=: 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@21 -- # val= 00:09:04.427 13:21:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # IFS=: 00:09:04.427 13:21:23 -- accel/accel.sh@20 -- # read -r var val 00:09:04.427 13:21:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:04.428 13:21:23 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:04.428 13:21:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:04.428 00:09:04.428 real 0m2.743s 00:09:04.428 user 0m2.419s 00:09:04.428 sys 0m0.321s 00:09:04.428 13:21:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.428 13:21:23 -- common/autotest_common.sh@10 -- # set +x 00:09:04.428 ************************************ 00:09:04.428 END TEST accel_decmop_full 00:09:04.428 ************************************ 00:09:04.716 13:21:23 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.716 13:21:23 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:09:04.716 13:21:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:04.716 13:21:23 -- common/autotest_common.sh@10 -- # set +x 00:09:04.716 ************************************ 00:09:04.716 START TEST accel_decomp_mcore 00:09:04.716 ************************************ 00:09:04.716 13:21:23 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.716 13:21:23 -- accel/accel.sh@16 -- # local accel_opc 00:09:04.716 13:21:23 -- accel/accel.sh@17 -- # local 
accel_module 00:09:04.716 13:21:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.716 13:21:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.716 13:21:23 -- accel/accel.sh@12 -- # build_accel_config 00:09:04.716 13:21:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:04.716 13:21:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:04.716 13:21:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:04.716 13:21:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:04.716 13:21:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:04.716 13:21:23 -- accel/accel.sh@41 -- # local IFS=, 00:09:04.716 13:21:23 -- accel/accel.sh@42 -- # jq -r . 00:09:04.716 [2024-07-24 13:21:23.325361] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:04.716 [2024-07-24 13:21:23.325454] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164644 ] 00:09:04.716 EAL: No free 2048 kB hugepages reported on node 1 00:09:04.716 [2024-07-24 13:21:23.434725] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:04.716 [2024-07-24 13:21:23.483891] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:04.716 [2024-07-24 13:21:23.483979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:04.716 [2024-07-24 13:21:23.484080] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.716 [2024-07-24 13:21:23.484080] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:06.091 13:21:24 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:09:06.091 00:09:06.091 SPDK Configuration: 00:09:06.091 Core mask: 0xf 00:09:06.091 00:09:06.091 Accel Perf Configuration: 00:09:06.091 Workload Type: decompress 00:09:06.091 Transfer size: 4096 bytes 00:09:06.091 Vector count 1 00:09:06.091 Module: software 00:09:06.091 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:06.091 Queue depth: 32 00:09:06.091 Allocate depth: 32 00:09:06.091 # threads/core: 1 00:09:06.091 Run time: 1 seconds 00:09:06.091 Verify: Yes 00:09:06.091 00:09:06.091 Running for 1 seconds... 00:09:06.091 00:09:06.091 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:06.091 ------------------------------------------------------------------------------------ 00:09:06.091 0,0 54400/s 100 MiB/s 0 0 00:09:06.091 3,0 54784/s 100 MiB/s 0 0 00:09:06.091 2,0 76256/s 140 MiB/s 0 0 00:09:06.091 1,0 54720/s 100 MiB/s 0 0 00:09:06.091 ==================================================================================== 00:09:06.091 Total 240160/s 938 MiB/s 0 0' 00:09:06.091 13:21:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:06.091 13:21:24 -- accel/accel.sh@12 -- # build_accel_config 00:09:06.091 13:21:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:06.091 13:21:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:06.091 13:21:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:06.091 13:21:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:06.091 13:21:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:06.091 13:21:24 -- accel/accel.sh@41 -- # local IFS=, 00:09:06.091 
13:21:24 -- accel/accel.sh@42 -- # jq -r . 00:09:06.091 [2024-07-24 13:21:24.690030] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:06.091 [2024-07-24 13:21:24.690102] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164851 ] 00:09:06.091 EAL: No free 2048 kB hugepages reported on node 1 00:09:06.091 [2024-07-24 13:21:24.795862] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:06.091 [2024-07-24 13:21:24.843468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:06.091 [2024-07-24 13:21:24.843556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:06.091 [2024-07-24 13:21:24.843661] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:06.091 [2024-07-24 13:21:24.843661] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val=0xf 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- 
accel/accel.sh@21 -- # val= 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val=decompress 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:09:06.091 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.091 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.091 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val=software 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@23 -- # accel_module=software 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val=32 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- 
accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val=32 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val=1 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val=Yes 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:06.092 13:21:24 -- accel/accel.sh@21 -- # val= 00:09:06.092 13:21:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # IFS=: 00:09:06.092 13:21:24 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 
13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@21 -- # val= 00:09:07.469 13:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # IFS=: 00:09:07.469 13:21:26 -- accel/accel.sh@20 -- # read -r var val 00:09:07.469 13:21:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:07.469 13:21:26 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:07.469 13:21:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:07.469 00:09:07.469 real 0m2.739s 00:09:07.469 user 0m9.138s 00:09:07.469 sys 0m0.337s 00:09:07.469 13:21:26 -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:09:07.469 13:21:26 -- common/autotest_common.sh@10 -- # set +x 00:09:07.469 ************************************ 00:09:07.469 END TEST accel_decomp_mcore 00:09:07.469 ************************************ 00:09:07.469 13:21:26 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.469 13:21:26 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:09:07.469 13:21:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:07.469 13:21:26 -- common/autotest_common.sh@10 -- # set +x 00:09:07.469 ************************************ 00:09:07.469 START TEST accel_decomp_full_mcore 00:09:07.469 ************************************ 00:09:07.469 13:21:26 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.469 13:21:26 -- accel/accel.sh@16 -- # local accel_opc 00:09:07.469 13:21:26 -- accel/accel.sh@17 -- # local accel_module 00:09:07.469 13:21:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.469 13:21:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.469 13:21:26 -- accel/accel.sh@12 -- # build_accel_config 00:09:07.469 13:21:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:07.469 13:21:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.469 13:21:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.469 13:21:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:07.469 13:21:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:07.469 13:21:26 -- accel/accel.sh@41 -- # local IFS=, 00:09:07.469 13:21:26 -- accel/accel.sh@42 -- # jq -r . 
00:09:07.469 [2024-07-24 13:21:26.115088] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:07.469 [2024-07-24 13:21:26.115178] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165082 ] 00:09:07.469 EAL: No free 2048 kB hugepages reported on node 1 00:09:07.469 [2024-07-24 13:21:26.237630] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:07.469 [2024-07-24 13:21:26.288361] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:07.469 [2024-07-24 13:21:26.288447] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:07.469 [2024-07-24 13:21:26.288548] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:07.469 [2024-07-24 13:21:26.288548] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.847 13:21:27 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:08.847 00:09:08.847 SPDK Configuration: 00:09:08.847 Core mask: 0xf 00:09:08.847 00:09:08.847 Accel Perf Configuration: 00:09:08.847 Workload Type: decompress 00:09:08.847 Transfer size: 111250 bytes 00:09:08.847 Vector count 1 00:09:08.847 Module: software 00:09:08.847 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:08.847 Queue depth: 32 00:09:08.847 Allocate depth: 32 00:09:08.847 # threads/core: 1 00:09:08.847 Run time: 1 seconds 00:09:08.847 Verify: Yes 00:09:08.847 00:09:08.847 Running for 1 seconds... 
00:09:08.847 00:09:08.847 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:08.847 ------------------------------------------------------------------------------------ 00:09:08.847 0,0 3872/s 159 MiB/s 0 0 00:09:08.847 3,0 3872/s 159 MiB/s 0 0 00:09:08.847 2,0 5664/s 233 MiB/s 0 0 00:09:08.847 1,0 3872/s 159 MiB/s 0 0 00:09:08.847 ==================================================================================== 00:09:08.847 Total 17280/s 1833 MiB/s 0 0' 00:09:08.847 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:08.847 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:08.847 13:21:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:08.847 13:21:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:08.847 13:21:27 -- accel/accel.sh@12 -- # build_accel_config 00:09:08.847 13:21:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:08.847 13:21:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:08.847 13:21:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:08.847 13:21:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:08.847 13:21:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:08.847 13:21:27 -- accel/accel.sh@41 -- # local IFS=, 00:09:08.847 13:21:27 -- accel/accel.sh@42 -- # jq -r . 00:09:08.847 [2024-07-24 13:21:27.519506] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:08.847 [2024-07-24 13:21:27.519601] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165289 ] 00:09:08.847 EAL: No free 2048 kB hugepages reported on node 1 00:09:08.847 [2024-07-24 13:21:27.640593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:08.847 [2024-07-24 13:21:27.691268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:08.847 [2024-07-24 13:21:27.691355] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:08.847 [2024-07-24 13:21:27.691461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:08.847 [2024-07-24 13:21:27.691462] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val=0xf 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 
-- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val=decompress 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val='111250 bytes' 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val=software 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@23 -- # accel_module=software 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val=32 00:09:09.107 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.107 13:21:27 -- accel/accel.sh@21 -- # val=32 00:09:09.107 13:21:27 -- 
accel/accel.sh@22 -- # case "$var" in 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.107 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.108 13:21:27 -- accel/accel.sh@21 -- # val=1 00:09:09.108 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.108 13:21:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:09.108 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.108 13:21:27 -- accel/accel.sh@21 -- # val=Yes 00:09:09.108 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.108 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.108 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:09.108 13:21:27 -- accel/accel.sh@21 -- # val= 00:09:09.108 13:21:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # IFS=: 00:09:09.108 13:21:27 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 
13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.046 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.046 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.046 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.305 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.305 13:21:28 -- accel/accel.sh@21 -- # val= 00:09:10.305 13:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.305 13:21:28 -- accel/accel.sh@20 -- # IFS=: 00:09:10.305 13:21:28 -- accel/accel.sh@20 -- # read -r var val 00:09:10.305 13:21:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:10.305 13:21:28 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:10.305 13:21:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:10.305 00:09:10.305 real 0m2.820s 00:09:10.305 user 0m9.244s 00:09:10.305 sys 0m0.401s 00:09:10.305 13:21:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.305 13:21:28 -- common/autotest_common.sh@10 -- # set +x 00:09:10.305 ************************************ 00:09:10.305 END TEST 
accel_decomp_full_mcore 00:09:10.305 ************************************ 00:09:10.305 13:21:28 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.305 13:21:28 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:09:10.305 13:21:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:10.305 13:21:28 -- common/autotest_common.sh@10 -- # set +x 00:09:10.305 ************************************ 00:09:10.305 START TEST accel_decomp_mthread 00:09:10.305 ************************************ 00:09:10.305 13:21:28 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.305 13:21:28 -- accel/accel.sh@16 -- # local accel_opc 00:09:10.305 13:21:28 -- accel/accel.sh@17 -- # local accel_module 00:09:10.305 13:21:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.305 13:21:28 -- accel/accel.sh@12 -- # build_accel_config 00:09:10.305 13:21:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:10.305 13:21:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:10.305 13:21:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.305 13:21:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:10.305 13:21:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:10.305 13:21:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:10.305 13:21:28 -- accel/accel.sh@41 -- # local IFS=, 00:09:10.305 13:21:28 -- accel/accel.sh@42 -- # jq -r . 00:09:10.305 [2024-07-24 13:21:28.981407] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:10.305 [2024-07-24 13:21:28.981500] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165545 ] 00:09:10.305 EAL: No free 2048 kB hugepages reported on node 1 00:09:10.305 [2024-07-24 13:21:29.102763] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.306 [2024-07-24 13:21:29.150018] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.682 13:21:30 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:11.682 00:09:11.682 SPDK Configuration: 00:09:11.682 Core mask: 0x1 00:09:11.682 00:09:11.682 Accel Perf Configuration: 00:09:11.682 Workload Type: decompress 00:09:11.682 Transfer size: 4096 bytes 00:09:11.682 Vector count 1 00:09:11.682 Module: software 00:09:11.682 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:11.682 Queue depth: 32 00:09:11.682 Allocate depth: 32 00:09:11.682 # threads/core: 2 00:09:11.683 Run time: 1 seconds 00:09:11.683 Verify: Yes 00:09:11.683 00:09:11.683 Running for 1 seconds... 
00:09:11.683 00:09:11.683 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:11.683 ------------------------------------------------------------------------------------ 00:09:11.683 0,1 30912/s 56 MiB/s 0 0 00:09:11.683 0,0 30816/s 56 MiB/s 0 0 00:09:11.683 ==================================================================================== 00:09:11.683 Total 61728/s 241 MiB/s 0 0' 00:09:11.683 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.683 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.683 13:21:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:11.683 13:21:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:11.683 13:21:30 -- accel/accel.sh@12 -- # build_accel_config 00:09:11.683 13:21:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:11.683 13:21:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:11.683 13:21:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:11.683 13:21:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:11.683 13:21:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:11.683 13:21:30 -- accel/accel.sh@41 -- # local IFS=, 00:09:11.683 13:21:30 -- accel/accel.sh@42 -- # jq -r . 00:09:11.683 [2024-07-24 13:21:30.376302] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:11.683 [2024-07-24 13:21:30.376397] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165723 ] 00:09:11.683 EAL: No free 2048 kB hugepages reported on node 1 00:09:11.683 [2024-07-24 13:21:30.495963] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.683 [2024-07-24 13:21:30.543713] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.942 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.942 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.942 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.942 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.942 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.942 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.942 13:21:30 -- accel/accel.sh@21 -- # val=0x1 00:09:11.942 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.942 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.942 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.942 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.942 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 
-- # val=decompress 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val=software 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@23 -- # accel_module=software 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val=32 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val=32 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val=2 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # 
IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val=Yes 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:11.943 13:21:30 -- accel/accel.sh@21 -- # val= 00:09:11.943 13:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # IFS=: 00:09:11.943 13:21:30 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 -- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 -- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 -- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 -- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 
-- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 -- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@21 -- # val= 00:09:13.322 13:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # IFS=: 00:09:13.322 13:21:31 -- accel/accel.sh@20 -- # read -r var val 00:09:13.322 13:21:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:13.322 13:21:31 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:13.322 13:21:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:13.322 00:09:13.322 real 0m2.796s 00:09:13.322 user 0m2.418s 00:09:13.322 sys 0m0.384s 00:09:13.322 13:21:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:13.322 13:21:31 -- common/autotest_common.sh@10 -- # set +x 00:09:13.322 ************************************ 00:09:13.322 END TEST accel_decomp_mthread 00:09:13.322 ************************************ 00:09:13.322 13:21:31 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.322 13:21:31 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:09:13.322 13:21:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:13.322 13:21:31 -- common/autotest_common.sh@10 -- # set +x 00:09:13.322 ************************************ 00:09:13.322 START TEST accel_deomp_full_mthread 00:09:13.322 ************************************ 00:09:13.322 13:21:31 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 
00:09:13.322 13:21:31 -- accel/accel.sh@16 -- # local accel_opc 00:09:13.322 13:21:31 -- accel/accel.sh@17 -- # local accel_module 00:09:13.322 13:21:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.322 13:21:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.322 13:21:31 -- accel/accel.sh@12 -- # build_accel_config 00:09:13.322 13:21:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:13.322 13:21:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:13.322 13:21:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:13.322 13:21:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:13.322 13:21:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:13.322 13:21:31 -- accel/accel.sh@41 -- # local IFS=, 00:09:13.322 13:21:31 -- accel/accel.sh@42 -- # jq -r . 00:09:13.322 [2024-07-24 13:21:31.828241] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:13.322 [2024-07-24 13:21:31.828334] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165923 ] 00:09:13.322 EAL: No free 2048 kB hugepages reported on node 1 00:09:13.323 [2024-07-24 13:21:31.949195] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.323 [2024-07-24 13:21:31.996468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.699 13:21:33 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:09:14.699 00:09:14.699 SPDK Configuration: 00:09:14.699 Core mask: 0x1 00:09:14.699 00:09:14.699 Accel Perf Configuration: 00:09:14.699 Workload Type: decompress 00:09:14.699 Transfer size: 111250 bytes 00:09:14.699 Vector count 1 00:09:14.699 Module: software 00:09:14.699 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:14.699 Queue depth: 32 00:09:14.699 Allocate depth: 32 00:09:14.699 # threads/core: 2 00:09:14.699 Run time: 1 seconds 00:09:14.699 Verify: Yes 00:09:14.699 00:09:14.699 Running for 1 seconds... 00:09:14.699 00:09:14.699 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:14.699 ------------------------------------------------------------------------------------ 00:09:14.699 0,1 1984/s 81 MiB/s 0 0 00:09:14.700 0,0 1952/s 80 MiB/s 0 0 00:09:14.700 ==================================================================================== 00:09:14.700 Total 3936/s 417 MiB/s 0 0' 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:14.700 13:21:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:14.700 13:21:33 -- accel/accel.sh@12 -- # build_accel_config 00:09:14.700 13:21:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:14.700 13:21:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:14.700 13:21:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:14.700 13:21:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:14.700 13:21:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:14.700 13:21:33 -- accel/accel.sh@41 -- # local IFS=, 00:09:14.700 13:21:33 -- accel/accel.sh@42 -- # jq -r . 
00:09:14.700 [2024-07-24 13:21:33.249658] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:14.700 [2024-07-24 13:21:33.249753] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166105 ] 00:09:14.700 EAL: No free 2048 kB hugepages reported on node 1 00:09:14.700 [2024-07-24 13:21:33.370758] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.700 [2024-07-24 13:21:33.417623] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=0x1 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- 
accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=decompress 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val='111250 bytes' 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=software 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@23 -- # accel_module=software 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=32 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=32 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- 
accel/accel.sh@21 -- # val=2 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val=Yes 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:14.700 13:21:33 -- accel/accel.sh@21 -- # val= 00:09:14.700 13:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # IFS=: 00:09:14.700 13:21:33 -- accel/accel.sh@20 -- # read -r var val 00:09:16.076 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.076 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # IFS=: 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.076 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.076 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # IFS=: 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.076 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.076 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # IFS=: 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.076 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.076 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.076 13:21:34 
-- accel/accel.sh@20 -- # IFS=: 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.076 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.076 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.076 13:21:34 -- accel/accel.sh@20 -- # IFS=: 00:09:16.077 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.077 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.077 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.077 13:21:34 -- accel/accel.sh@20 -- # IFS=: 00:09:16.077 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.077 13:21:34 -- accel/accel.sh@21 -- # val= 00:09:16.077 13:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.077 13:21:34 -- accel/accel.sh@20 -- # IFS=: 00:09:16.077 13:21:34 -- accel/accel.sh@20 -- # read -r var val 00:09:16.077 13:21:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:16.077 13:21:34 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:16.077 13:21:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:16.077 00:09:16.077 real 0m2.848s 00:09:16.077 user 0m2.480s 00:09:16.077 sys 0m0.373s 00:09:16.077 13:21:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.077 13:21:34 -- common/autotest_common.sh@10 -- # set +x 00:09:16.077 ************************************ 00:09:16.077 END TEST accel_deomp_full_mthread 00:09:16.077 ************************************ 00:09:16.077 13:21:34 -- accel/accel.sh@116 -- # [[ n == y ]] 00:09:16.077 13:21:34 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:16.077 13:21:34 -- accel/accel.sh@129 -- # build_accel_config 00:09:16.077 13:21:34 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:09:16.077 13:21:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:16.077 13:21:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:16.077 13:21:34 -- common/autotest_common.sh@10 -- # set +x 00:09:16.077 
13:21:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:16.077 13:21:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:16.077 13:21:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:16.077 13:21:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:16.077 13:21:34 -- accel/accel.sh@41 -- # local IFS=, 00:09:16.077 13:21:34 -- accel/accel.sh@42 -- # jq -r . 00:09:16.077 ************************************ 00:09:16.077 START TEST accel_dif_functional_tests 00:09:16.077 ************************************ 00:09:16.077 13:21:34 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:16.077 [2024-07-24 13:21:34.729060] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:16.077 [2024-07-24 13:21:34.729137] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166301 ] 00:09:16.077 EAL: No free 2048 kB hugepages reported on node 1 00:09:16.077 [2024-07-24 13:21:34.847664] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:16.077 [2024-07-24 13:21:34.901373] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:16.077 [2024-07-24 13:21:34.901474] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:16.077 [2024-07-24 13:21:34.901483] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.336 00:09:16.336 00:09:16.336 CUnit - A unit testing framework for C - Version 2.1-3 00:09:16.336 http://cunit.sourceforge.net/ 00:09:16.336 00:09:16.336 00:09:16.336 Suite: accel_dif 00:09:16.336 Test: verify: DIF generated, GUARD check ...passed 00:09:16.336 Test: verify: DIF generated, APPTAG check ...passed 00:09:16.336 Test: verify: DIF generated, REFTAG check ...passed 00:09:16.336 Test: verify: DIF not generated, GUARD check ...[2024-07-24 13:21:34.980314] 
dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:16.336 [2024-07-24 13:21:34.980375] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:16.336 passed 00:09:16.336 Test: verify: DIF not generated, APPTAG check ...[2024-07-24 13:21:34.980420] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:16.336 [2024-07-24 13:21:34.980446] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:16.336 passed 00:09:16.336 Test: verify: DIF not generated, REFTAG check ...[2024-07-24 13:21:34.980477] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:16.336 [2024-07-24 13:21:34.980502] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:16.336 passed 00:09:16.336 Test: verify: APPTAG correct, APPTAG check ...passed 00:09:16.336 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-24 13:21:34.980561] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:09:16.336 passed 00:09:16.336 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:09:16.336 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:09:16.336 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:09:16.336 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-24 13:21:34.980699] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:09:16.336 passed 00:09:16.336 Test: generate copy: DIF generated, GUARD check ...passed 00:09:16.336 Test: generate copy: DIF generated, APTTAG check ...passed 00:09:16.336 Test: generate copy: DIF generated, REFTAG check ...passed 00:09:16.336 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:09:16.336 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 
00:09:16.336 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:09:16.336 Test: generate copy: iovecs-len validate ...[2024-07-24 13:21:34.980933] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:09:16.336 passed 00:09:16.336 Test: generate copy: buffer alignment validate ...passed 00:09:16.336 00:09:16.336 Run Summary: Type Total Ran Passed Failed Inactive 00:09:16.336 suites 1 1 n/a 0 0 00:09:16.336 tests 20 20 20 0 0 00:09:16.336 asserts 204 204 204 0 n/a 00:09:16.336 00:09:16.336 Elapsed time = 0.003 seconds 00:09:16.336 00:09:16.336 real 0m0.461s 00:09:16.336 user 0m0.658s 00:09:16.336 sys 0m0.213s 00:09:16.336 13:21:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.336 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.336 ************************************ 00:09:16.336 END TEST accel_dif_functional_tests 00:09:16.336 ************************************ 00:09:16.595 00:09:16.595 real 0m58.898s 00:09:16.595 user 1m4.704s 00:09:16.595 sys 0m9.062s 00:09:16.595 13:21:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.595 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.595 ************************************ 00:09:16.595 END TEST accel 00:09:16.595 ************************************ 00:09:16.595 13:21:35 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:16.595 13:21:35 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:16.595 13:21:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:16.595 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.595 ************************************ 00:09:16.595 START TEST accel_rpc 00:09:16.595 ************************************ 00:09:16.595 13:21:35 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 
00:09:16.595 * Looking for test storage... 00:09:16.596 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:09:16.596 13:21:35 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:16.596 13:21:35 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3166372 00:09:16.596 13:21:35 -- accel/accel_rpc.sh@15 -- # waitforlisten 3166372 00:09:16.596 13:21:35 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:09:16.596 13:21:35 -- common/autotest_common.sh@819 -- # '[' -z 3166372 ']' 00:09:16.596 13:21:35 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:16.596 13:21:35 -- common/autotest_common.sh@824 -- # local max_retries=100 00:09:16.596 13:21:35 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:16.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:16.596 13:21:35 -- common/autotest_common.sh@828 -- # xtrace_disable 00:09:16.596 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.596 [2024-07-24 13:21:35.383692] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:16.596 [2024-07-24 13:21:35.383788] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166372 ] 00:09:16.596 EAL: No free 2048 kB hugepages reported on node 1 00:09:16.854 [2024-07-24 13:21:35.502169] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.854 [2024-07-24 13:21:35.550885] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:16.854 [2024-07-24 13:21:35.551045] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.854 13:21:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:09:16.854 13:21:35 -- common/autotest_common.sh@852 -- # return 0 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:09:16.854 13:21:35 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:16.854 13:21:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:16.854 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.854 ************************************ 00:09:16.854 START TEST accel_assign_opcode 00:09:16.854 ************************************ 00:09:16.854 13:21:35 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:09:16.854 13:21:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:16.854 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.854 [2024-07-24 13:21:35.615658] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation 
copy will be assigned to module incorrect 00:09:16.854 13:21:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:16.854 13:21:35 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:09:16.854 13:21:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:16.855 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:16.855 [2024-07-24 13:21:35.623670] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:09:16.855 13:21:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:16.855 13:21:35 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:09:16.855 13:21:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:16.855 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:17.113 13:21:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:17.113 13:21:35 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:09:17.114 13:21:35 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:09:17.114 13:21:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:17.114 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:17.114 13:21:35 -- accel/accel_rpc.sh@42 -- # grep software 00:09:17.114 13:21:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:17.114 software 00:09:17.114 00:09:17.114 real 0m0.265s 00:09:17.114 user 0m0.050s 00:09:17.114 sys 0m0.013s 00:09:17.114 13:21:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.114 13:21:35 -- common/autotest_common.sh@10 -- # set +x 00:09:17.114 ************************************ 00:09:17.114 END TEST accel_assign_opcode 00:09:17.114 ************************************ 00:09:17.114 13:21:35 -- accel/accel_rpc.sh@55 -- # killprocess 3166372 00:09:17.114 13:21:35 -- common/autotest_common.sh@926 -- # '[' -z 3166372 ']' 00:09:17.114 13:21:35 -- common/autotest_common.sh@930 -- # kill -0 3166372 00:09:17.114 13:21:35 -- common/autotest_common.sh@931 -- # uname 00:09:17.114 
13:21:35 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:09:17.114 13:21:35 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3166372 00:09:17.114 13:21:35 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:09:17.114 13:21:35 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:09:17.114 13:21:35 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3166372' 00:09:17.114 killing process with pid 3166372 00:09:17.114 13:21:35 -- common/autotest_common.sh@945 -- # kill 3166372 00:09:17.114 13:21:35 -- common/autotest_common.sh@950 -- # wait 3166372 00:09:17.681 00:09:17.681 real 0m1.058s 00:09:17.681 user 0m0.935s 00:09:17.681 sys 0m0.521s 00:09:17.681 13:21:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.681 13:21:36 -- common/autotest_common.sh@10 -- # set +x 00:09:17.681 ************************************ 00:09:17.681 END TEST accel_rpc 00:09:17.681 ************************************ 00:09:17.681 13:21:36 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:09:17.681 13:21:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:17.681 13:21:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:17.681 13:21:36 -- common/autotest_common.sh@10 -- # set +x 00:09:17.681 ************************************ 00:09:17.681 START TEST app_cmdline 00:09:17.681 ************************************ 00:09:17.681 13:21:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:09:17.681 * Looking for test storage... 
00:09:17.681 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:17.681 13:21:36 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:09:17.681 13:21:36 -- app/cmdline.sh@17 -- # spdk_tgt_pid=3166615 00:09:17.681 13:21:36 -- app/cmdline.sh@18 -- # waitforlisten 3166615 00:09:17.682 13:21:36 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:09:17.682 13:21:36 -- common/autotest_common.sh@819 -- # '[' -z 3166615 ']' 00:09:17.682 13:21:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:17.682 13:21:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:09:17.682 13:21:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:17.682 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:17.682 13:21:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:09:17.682 13:21:36 -- common/autotest_common.sh@10 -- # set +x 00:09:17.682 [2024-07-24 13:21:36.498016] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:17.682 [2024-07-24 13:21:36.498093] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166615 ] 00:09:17.941 EAL: No free 2048 kB hugepages reported on node 1 00:09:17.941 [2024-07-24 13:21:36.619600] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.941 [2024-07-24 13:21:36.664293] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:17.941 [2024-07-24 13:21:36.664450] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.878 13:21:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:09:18.878 13:21:37 -- common/autotest_common.sh@852 -- # return 0 00:09:18.878 13:21:37 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:09:18.878 { 00:09:18.878 "version": "SPDK v24.01.1-pre git sha1 dbef7efac", 00:09:18.878 "fields": { 00:09:18.878 "major": 24, 00:09:18.878 "minor": 1, 00:09:18.878 "patch": 1, 00:09:18.878 "suffix": "-pre", 00:09:18.878 "commit": "dbef7efac" 00:09:18.878 } 00:09:18.878 } 00:09:18.878 13:21:37 -- app/cmdline.sh@22 -- # expected_methods=() 00:09:18.878 13:21:37 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:09:18.878 13:21:37 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:09:18.878 13:21:37 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:09:18.878 13:21:37 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:09:18.878 13:21:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:18.878 13:21:37 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:09:18.878 13:21:37 -- common/autotest_common.sh@10 -- # set +x 00:09:18.878 13:21:37 -- app/cmdline.sh@26 -- # sort 00:09:18.878 13:21:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:18.878 
13:21:37 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:09:18.878 13:21:37 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:09:18.878 13:21:37 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:18.878 13:21:37 -- common/autotest_common.sh@640 -- # local es=0 00:09:18.878 13:21:37 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:18.878 13:21:37 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.878 13:21:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:09:18.878 13:21:37 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.878 13:21:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:09:18.878 13:21:37 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.878 13:21:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:09:18.878 13:21:37 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.878 13:21:37 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:09:18.878 13:21:37 -- common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:19.137 request: 00:09:19.137 { 00:09:19.137 "method": "env_dpdk_get_mem_stats", 00:09:19.137 "req_id": 1 00:09:19.137 } 00:09:19.137 Got JSON-RPC error response 00:09:19.137 response: 00:09:19.137 { 00:09:19.137 "code": -32601, 00:09:19.137 "message": "Method not found" 00:09:19.137 } 00:09:19.137 13:21:37 -- 
common/autotest_common.sh@643 -- # es=1 00:09:19.137 13:21:37 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:09:19.137 13:21:37 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:09:19.137 13:21:37 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:09:19.137 13:21:37 -- app/cmdline.sh@1 -- # killprocess 3166615 00:09:19.137 13:21:37 -- common/autotest_common.sh@926 -- # '[' -z 3166615 ']' 00:09:19.137 13:21:37 -- common/autotest_common.sh@930 -- # kill -0 3166615 00:09:19.137 13:21:37 -- common/autotest_common.sh@931 -- # uname 00:09:19.137 13:21:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:09:19.137 13:21:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3166615 00:09:19.137 13:21:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:09:19.137 13:21:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:09:19.137 13:21:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3166615' 00:09:19.137 killing process with pid 3166615 00:09:19.137 13:21:37 -- common/autotest_common.sh@945 -- # kill 3166615 00:09:19.137 13:21:37 -- common/autotest_common.sh@950 -- # wait 3166615 00:09:19.706 00:09:19.706 real 0m1.924s 00:09:19.706 user 0m2.304s 00:09:19.706 sys 0m0.585s 00:09:19.706 13:21:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.706 13:21:38 -- common/autotest_common.sh@10 -- # set +x 00:09:19.706 ************************************ 00:09:19.706 END TEST app_cmdline 00:09:19.706 ************************************ 00:09:19.706 13:21:38 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:09:19.706 13:21:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:19.706 13:21:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:19.706 13:21:38 -- common/autotest_common.sh@10 -- # set +x 00:09:19.706 ************************************ 00:09:19.706 START TEST version 
00:09:19.706 ************************************ 00:09:19.706 13:21:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:09:19.706 * Looking for test storage... 00:09:19.706 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:19.706 13:21:38 -- app/version.sh@17 -- # get_header_version major 00:09:19.706 13:21:38 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.706 13:21:38 -- app/version.sh@14 -- # cut -f2 00:09:19.706 13:21:38 -- app/version.sh@14 -- # tr -d '"' 00:09:19.706 13:21:38 -- app/version.sh@17 -- # major=24 00:09:19.706 13:21:38 -- app/version.sh@18 -- # get_header_version minor 00:09:19.706 13:21:38 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.706 13:21:38 -- app/version.sh@14 -- # cut -f2 00:09:19.706 13:21:38 -- app/version.sh@14 -- # tr -d '"' 00:09:19.706 13:21:38 -- app/version.sh@18 -- # minor=1 00:09:19.706 13:21:38 -- app/version.sh@19 -- # get_header_version patch 00:09:19.706 13:21:38 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.706 13:21:38 -- app/version.sh@14 -- # cut -f2 00:09:19.706 13:21:38 -- app/version.sh@14 -- # tr -d '"' 00:09:19.706 13:21:38 -- app/version.sh@19 -- # patch=1 00:09:19.706 13:21:38 -- app/version.sh@20 -- # get_header_version suffix 00:09:19.706 13:21:38 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.706 13:21:38 -- app/version.sh@14 -- # cut -f2 00:09:19.706 13:21:38 -- app/version.sh@14 -- # tr -d '"' 00:09:19.706 13:21:38 -- app/version.sh@20 -- # suffix=-pre 
00:09:19.706 13:21:38 -- app/version.sh@22 -- # version=24.1 00:09:19.706 13:21:38 -- app/version.sh@25 -- # (( patch != 0 )) 00:09:19.706 13:21:38 -- app/version.sh@25 -- # version=24.1.1 00:09:19.706 13:21:38 -- app/version.sh@28 -- # version=24.1.1rc0 00:09:19.706 13:21:38 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:19.706 13:21:38 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:19.706 13:21:38 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:09:19.706 13:21:38 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:09:19.706 00:09:19.706 real 0m0.192s 00:09:19.706 user 0m0.095s 00:09:19.706 sys 0m0.146s 00:09:19.706 13:21:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.706 13:21:38 -- common/autotest_common.sh@10 -- # set +x 00:09:19.706 ************************************ 00:09:19.706 END TEST version 00:09:19.706 ************************************ 00:09:19.966 13:21:38 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@204 -- # uname -s 00:09:19.966 13:21:38 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:09:19.966 13:21:38 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:09:19.966 13:21:38 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:09:19.966 13:21:38 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@268 -- # timing_exit lib 00:09:19.966 13:21:38 -- common/autotest_common.sh@718 -- # xtrace_disable 00:09:19.966 13:21:38 -- common/autotest_common.sh@10 -- # set +x 00:09:19.966 13:21:38 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:09:19.966 
13:21:38 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:09:19.966 13:21:38 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:09:19.966 13:21:38 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:09:19.966 13:21:38 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:09:19.966 13:21:38 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:09:19.966 13:21:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:19.966 13:21:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:19.966 13:21:38 -- common/autotest_common.sh@10 -- # set +x 00:09:19.966 ************************************ 00:09:19.966 START TEST llvm_fuzz 00:09:19.966 ************************************ 00:09:19.966 13:21:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:09:19.966 * Looking for test storage... 
00:09:19.966 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:09:19.966 13:21:38 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:09:19.966 13:21:38 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:09:19.966 13:21:38 -- common/autotest_common.sh@538 -- # fuzzers=() 00:09:19.966 13:21:38 -- common/autotest_common.sh@538 -- # local fuzzers 00:09:19.966 13:21:38 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:09:19.966 13:21:38 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:09:19.966 13:21:38 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:09:19.966 13:21:38 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:09:19.966 13:21:38 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:09:19.966 13:21:38 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:09:19.966 13:21:38 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:09:19.966 13:21:38 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:19.966 13:21:38 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:19.966 13:21:38 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:19.966 13:21:38 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:19.966 13:21:38 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:19.966 13:21:38 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:19.966 13:21:38 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:09:19.966 13:21:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:19.967 13:21:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:19.967 13:21:38 -- common/autotest_common.sh@10 -- # set +x 00:09:19.967 ************************************ 00:09:19.967 START TEST nvmf_fuzz 
00:09:19.967 ************************************ 00:09:19.967 13:21:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:09:20.228 * Looking for test storage... 00:09:20.228 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.228 13:21:38 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:09:20.228 13:21:38 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:09:20.228 13:21:38 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:20.228 13:21:38 -- common/autotest_common.sh@34 -- # set -e 00:09:20.228 13:21:38 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:20.228 13:21:38 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:20.228 13:21:38 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:20.228 13:21:38 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:09:20.228 13:21:38 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:20.228 13:21:38 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:09:20.228 13:21:38 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:20.228 13:21:38 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:20.228 13:21:38 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:20.228 13:21:38 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:20.228 13:21:38 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:20.228 13:21:38 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:20.228 13:21:38 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:20.228 13:21:38 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:20.228 13:21:38 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 
00:09:20.228 13:21:38 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:20.228 13:21:38 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:09:20.228 13:21:38 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:20.228 13:21:38 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:20.228 13:21:38 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:20.228 13:21:38 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:09:20.228 13:21:38 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:20.228 13:21:38 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:20.228 13:21:38 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:09:20.228 13:21:38 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:09:20.228 13:21:38 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:09:20.228 13:21:38 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:20.228 13:21:38 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:09:20.228 13:21:38 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:09:20.228 13:21:38 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:20.228 13:21:38 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:20.228 13:21:38 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:09:20.228 13:21:38 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:09:20.228 13:21:38 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:09:20.228 13:21:38 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:09:20.228 13:21:38 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:09:20.228 13:21:38 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:09:20.228 13:21:38 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:09:20.228 13:21:38 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:09:20.228 13:21:38 -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:20.228 13:21:38 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:09:20.228 13:21:38 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:09:20.228 13:21:38 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:09:20.228 13:21:38 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:09:20.228 13:21:38 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:20.228 13:21:38 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:09:20.228 13:21:38 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:09:20.228 13:21:38 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:20.228 13:21:38 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:09:20.228 13:21:38 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:09:20.228 13:21:38 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:09:20.228 13:21:38 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:20.228 13:21:38 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:09:20.228 13:21:38 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:09:20.228 13:21:38 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:09:20.228 13:21:38 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:09:20.228 13:21:38 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:09:20.228 13:21:38 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:09:20.228 13:21:38 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:09:20.228 13:21:38 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:09:20.228 13:21:38 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:09:20.228 13:21:38 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:09:20.228 13:21:38 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:09:20.228 13:21:38 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:09:20.228 13:21:38 -- common/build_config.sh@61 -- # 
CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:20.228 13:21:38 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:09:20.228 13:21:38 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:09:20.228 13:21:38 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:09:20.228 13:21:38 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:09:20.228 13:21:38 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:20.228 13:21:38 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:09:20.228 13:21:38 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:09:20.228 13:21:38 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:09:20.228 13:21:38 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:09:20.228 13:21:38 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:09:20.228 13:21:38 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:09:20.228 13:21:38 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:09:20.228 13:21:38 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:09:20.228 13:21:38 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:09:20.228 13:21:38 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:09:20.228 13:21:38 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:20.228 13:21:38 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:09:20.228 13:21:38 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:09:20.228 13:21:38 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:20.228 13:21:38 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:20.228 13:21:38 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:20.228 13:21:38 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:20.228 13:21:38 -- 
common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:20.228 13:21:38 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:20.228 13:21:38 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:20.228 13:21:38 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:20.228 13:21:38 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:20.228 13:21:38 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:20.228 13:21:38 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:20.228 13:21:38 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:20.228 13:21:38 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:20.228 13:21:38 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:20.228 13:21:38 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:09:20.228 13:21:38 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:20.228 #define SPDK_CONFIG_H 00:09:20.228 #define SPDK_CONFIG_APPS 1 00:09:20.228 #define SPDK_CONFIG_ARCH native 00:09:20.228 #undef SPDK_CONFIG_ASAN 00:09:20.228 #undef SPDK_CONFIG_AVAHI 00:09:20.228 #undef SPDK_CONFIG_CET 00:09:20.228 #define SPDK_CONFIG_COVERAGE 1 00:09:20.228 #define SPDK_CONFIG_CROSS_PREFIX 00:09:20.228 #undef SPDK_CONFIG_CRYPTO 00:09:20.228 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:20.228 #undef SPDK_CONFIG_CUSTOMOCF 00:09:20.228 #undef SPDK_CONFIG_DAOS 00:09:20.228 #define SPDK_CONFIG_DAOS_DIR 00:09:20.228 #define SPDK_CONFIG_DEBUG 1 00:09:20.228 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:20.228 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:20.228 #define 
SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:20.229 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:20.229 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:20.229 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:20.229 #define SPDK_CONFIG_EXAMPLES 1 00:09:20.229 #undef SPDK_CONFIG_FC 00:09:20.229 #define SPDK_CONFIG_FC_PATH 00:09:20.229 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:20.229 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:20.229 #undef SPDK_CONFIG_FUSE 00:09:20.229 #define SPDK_CONFIG_FUZZER 1 00:09:20.229 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:09:20.229 #undef SPDK_CONFIG_GOLANG 00:09:20.229 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:20.229 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:20.229 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:20.229 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:20.229 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:20.229 #define SPDK_CONFIG_IDXD 1 00:09:20.229 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:20.229 #undef SPDK_CONFIG_IPSEC_MB 00:09:20.229 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:20.229 #define SPDK_CONFIG_ISAL 1 00:09:20.229 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:20.229 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:20.229 #define SPDK_CONFIG_LIBDIR 00:09:20.229 #undef SPDK_CONFIG_LTO 00:09:20.229 #define SPDK_CONFIG_MAX_LCORES 00:09:20.229 #define SPDK_CONFIG_NVME_CUSE 1 00:09:20.229 #undef SPDK_CONFIG_OCF 00:09:20.229 #define SPDK_CONFIG_OCF_PATH 00:09:20.229 #define SPDK_CONFIG_OPENSSL_PATH 00:09:20.229 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:20.229 #undef SPDK_CONFIG_PGO_USE 00:09:20.229 #define SPDK_CONFIG_PREFIX /usr/local 00:09:20.229 #undef SPDK_CONFIG_RAID5F 00:09:20.229 #undef SPDK_CONFIG_RBD 00:09:20.229 #define SPDK_CONFIG_RDMA 1 00:09:20.229 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:20.229 #define 
SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:20.229 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:20.229 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:20.229 #undef SPDK_CONFIG_SHARED 00:09:20.229 #undef SPDK_CONFIG_SMA 00:09:20.229 #define SPDK_CONFIG_TESTS 1 00:09:20.229 #undef SPDK_CONFIG_TSAN 00:09:20.229 #define SPDK_CONFIG_UBLK 1 00:09:20.229 #define SPDK_CONFIG_UBSAN 1 00:09:20.229 #undef SPDK_CONFIG_UNIT_TESTS 00:09:20.229 #undef SPDK_CONFIG_URING 00:09:20.229 #define SPDK_CONFIG_URING_PATH 00:09:20.229 #undef SPDK_CONFIG_URING_ZNS 00:09:20.229 #undef SPDK_CONFIG_USDT 00:09:20.229 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:20.229 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:20.229 #define SPDK_CONFIG_VFIO_USER 1 00:09:20.229 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:20.229 #define SPDK_CONFIG_VHOST 1 00:09:20.229 #define SPDK_CONFIG_VIRTIO 1 00:09:20.229 #undef SPDK_CONFIG_VTUNE 00:09:20.229 #define SPDK_CONFIG_VTUNE_DIR 00:09:20.229 #define SPDK_CONFIG_WERROR 1 00:09:20.229 #define SPDK_CONFIG_WPDK_DIR 00:09:20.229 #undef SPDK_CONFIG_XNVME 00:09:20.229 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:20.229 13:21:38 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:20.229 13:21:38 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:20.229 13:21:38 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:20.229 13:21:38 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:20.229 13:21:38 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:20.229 13:21:38 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.229 13:21:38 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.229 13:21:38 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.229 13:21:38 -- paths/export.sh@5 -- # export PATH 00:09:20.229 13:21:38 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:20.229 13:21:38 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:20.229 13:21:38 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:20.229 13:21:38 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:20.229 13:21:38 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:20.229 13:21:38 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:20.229 13:21:38 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:20.229 13:21:38 -- pm/common@16 -- # TEST_TAG=N/A 00:09:20.229 13:21:38 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:20.229 13:21:38 -- common/autotest_common.sh@52 -- # : 1 00:09:20.229 13:21:38 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:09:20.229 13:21:38 -- common/autotest_common.sh@56 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:20.229 13:21:38 -- common/autotest_common.sh@58 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:09:20.229 13:21:38 -- common/autotest_common.sh@60 -- # : 1 00:09:20.229 13:21:38 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 
00:09:20.229 13:21:38 -- common/autotest_common.sh@62 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:09:20.229 13:21:38 -- common/autotest_common.sh@64 -- # : 00:09:20.229 13:21:38 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:09:20.229 13:21:38 -- common/autotest_common.sh@66 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:09:20.229 13:21:38 -- common/autotest_common.sh@68 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:09:20.229 13:21:38 -- common/autotest_common.sh@70 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:09:20.229 13:21:38 -- common/autotest_common.sh@72 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:20.229 13:21:38 -- common/autotest_common.sh@74 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:09:20.229 13:21:38 -- common/autotest_common.sh@76 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:09:20.229 13:21:38 -- common/autotest_common.sh@78 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:09:20.229 13:21:38 -- common/autotest_common.sh@80 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:09:20.229 13:21:38 -- common/autotest_common.sh@82 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:09:20.229 13:21:38 -- common/autotest_common.sh@84 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:09:20.229 13:21:38 -- common/autotest_common.sh@86 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:09:20.229 13:21:38 -- common/autotest_common.sh@88 -- # : 0 00:09:20.229 
13:21:38 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:09:20.229 13:21:38 -- common/autotest_common.sh@90 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:20.229 13:21:38 -- common/autotest_common.sh@92 -- # : 1 00:09:20.229 13:21:38 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:09:20.229 13:21:38 -- common/autotest_common.sh@94 -- # : 1 00:09:20.229 13:21:38 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:09:20.229 13:21:38 -- common/autotest_common.sh@96 -- # : rdma 00:09:20.229 13:21:38 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:20.229 13:21:38 -- common/autotest_common.sh@98 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:09:20.229 13:21:38 -- common/autotest_common.sh@100 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:09:20.229 13:21:38 -- common/autotest_common.sh@102 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:09:20.229 13:21:38 -- common/autotest_common.sh@104 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:09:20.229 13:21:38 -- common/autotest_common.sh@106 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:09:20.229 13:21:38 -- common/autotest_common.sh@108 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:09:20.229 13:21:38 -- common/autotest_common.sh@110 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:09:20.229 13:21:38 -- common/autotest_common.sh@112 -- # : 0 00:09:20.229 13:21:38 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:20.230 13:21:38 -- common/autotest_common.sh@114 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@115 -- # export 
SPDK_RUN_ASAN 00:09:20.230 13:21:38 -- common/autotest_common.sh@116 -- # : 1 00:09:20.230 13:21:38 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:09:20.230 13:21:38 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:20.230 13:21:38 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:20.230 13:21:38 -- common/autotest_common.sh@120 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:09:20.230 13:21:38 -- common/autotest_common.sh@122 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:09:20.230 13:21:38 -- common/autotest_common.sh@124 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:09:20.230 13:21:38 -- common/autotest_common.sh@126 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:09:20.230 13:21:38 -- common/autotest_common.sh@128 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:09:20.230 13:21:38 -- common/autotest_common.sh@130 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:09:20.230 13:21:38 -- common/autotest_common.sh@132 -- # : v23.11 00:09:20.230 13:21:38 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:09:20.230 13:21:38 -- common/autotest_common.sh@134 -- # : true 00:09:20.230 13:21:38 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:09:20.230 13:21:38 -- common/autotest_common.sh@136 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:09:20.230 13:21:38 -- common/autotest_common.sh@138 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:09:20.230 13:21:38 -- common/autotest_common.sh@140 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 
00:09:20.230 13:21:38 -- common/autotest_common.sh@142 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:09:20.230 13:21:38 -- common/autotest_common.sh@144 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:09:20.230 13:21:38 -- common/autotest_common.sh@146 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:09:20.230 13:21:38 -- common/autotest_common.sh@148 -- # : 00:09:20.230 13:21:38 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:09:20.230 13:21:38 -- common/autotest_common.sh@150 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:09:20.230 13:21:38 -- common/autotest_common.sh@152 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:09:20.230 13:21:38 -- common/autotest_common.sh@154 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:09:20.230 13:21:38 -- common/autotest_common.sh@156 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:09:20.230 13:21:38 -- common/autotest_common.sh@158 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:09:20.230 13:21:38 -- common/autotest_common.sh@160 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:09:20.230 13:21:38 -- common/autotest_common.sh@163 -- # : 00:09:20.230 13:21:38 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:09:20.230 13:21:38 -- common/autotest_common.sh@165 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:09:20.230 13:21:38 -- common/autotest_common.sh@167 -- # : 0 00:09:20.230 13:21:38 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:20.230 13:21:38 -- 
common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:20.230 
13:21:38 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:20.230 13:21:38 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:20.230 13:21:38 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:20.230 13:21:38 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:20.230 13:21:38 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:20.230 13:21:38 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:20.230 13:21:38 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:09:20.230 13:21:38 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:20.230 13:21:38 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:20.230 13:21:38 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:20.230 13:21:38 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:20.230 13:21:38 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:20.230 13:21:38 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:09:20.230 13:21:38 -- common/autotest_common.sh@196 -- # cat 00:09:20.230 13:21:38 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:09:20.230 13:21:38 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:20.230 13:21:38 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:20.230 13:21:38 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:20.230 13:21:38 -- 
common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:20.230 13:21:38 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:09:20.230 13:21:38 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:09:20.230 13:21:38 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:20.230 13:21:38 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:20.230 13:21:38 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:20.230 13:21:38 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:20.230 13:21:38 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:20.230 13:21:38 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:20.230 13:21:38 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:20.230 13:21:38 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:20.230 13:21:38 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:20.230 13:21:38 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:20.230 13:21:38 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:20.230 13:21:38 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:20.230 13:21:38 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:09:20.230 13:21:38 -- common/autotest_common.sh@249 -- # export valgrind= 00:09:20.230 13:21:38 -- 
common/autotest_common.sh@249 -- # valgrind= 00:09:20.230 13:21:38 -- common/autotest_common.sh@255 -- # uname -s 00:09:20.231 13:21:38 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:09:20.231 13:21:38 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:09:20.231 13:21:38 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:09:20.231 13:21:38 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:09:20.231 13:21:38 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@265 -- # MAKE=make 00:09:20.231 13:21:38 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j72 00:09:20.231 13:21:38 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:09:20.231 13:21:38 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:09:20.231 13:21:38 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:20.231 13:21:38 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:09:20.231 13:21:38 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:09:20.231 13:21:38 -- common/autotest_common.sh@309 -- # [[ -z 3167113 ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@309 -- # kill -0 3167113 00:09:20.231 13:21:38 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:09:20.231 13:21:38 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:09:20.231 13:21:38 -- common/autotest_common.sh@322 -- # local mount target_dir 00:09:20.231 13:21:38 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:09:20.231 13:21:38 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:09:20.231 13:21:38 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:09:20.231 13:21:38 -- common/autotest_common.sh@329 -- # 
mktemp -udt spdk.XXXXXX 00:09:20.231 13:21:38 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.tnEzvZ 00:09:20.231 13:21:38 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:20.231 13:21:38 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.tnEzvZ/tests/nvmf /tmp/spdk.tnEzvZ 00:09:20.231 13:21:38 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@318 -- # df -T 00:09:20.231 13:21:38 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=893108224 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=4391321600 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 
13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=80274530304 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=94508572672 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=14234042368 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=47200768000 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254286336 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=53518336 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=18895626240 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=18901716992 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=6090752 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=47252979712 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254286336 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=1306624 
00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=9450852352 00:09:20.231 13:21:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=9450856448 00:09:20.231 13:21:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:09:20.231 13:21:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.231 13:21:38 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:09:20.231 * Looking for test storage... 00:09:20.231 13:21:38 -- common/autotest_common.sh@359 -- # local target_space new_size 00:09:20.231 13:21:38 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:09:20.231 13:21:38 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.231 13:21:38 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:20.231 13:21:38 -- common/autotest_common.sh@363 -- # mount=/ 00:09:20.231 13:21:38 -- common/autotest_common.sh@365 -- # target_space=80274530304 00:09:20.231 13:21:38 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:09:20.231 13:21:38 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:09:20.231 13:21:38 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:09:20.231 13:21:38 -- common/autotest_common.sh@372 -- # new_size=16448634880 00:09:20.231 13:21:38 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:20.231 13:21:38 -- 
common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.231 13:21:38 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.231 13:21:39 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.231 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.231 13:21:39 -- common/autotest_common.sh@380 -- # return 0 00:09:20.231 13:21:39 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:09:20.231 13:21:39 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:09:20.231 13:21:39 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:20.231 13:21:39 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:20.231 13:21:39 -- common/autotest_common.sh@1672 -- # true 00:09:20.231 13:21:39 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:09:20.231 13:21:39 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:20.231 13:21:39 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:20.231 13:21:39 -- common/autotest_common.sh@27 -- # exec 00:09:20.231 13:21:39 -- common/autotest_common.sh@29 -- # exec 00:09:20.231 13:21:39 -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:20.231 13:21:39 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:09:20.231 13:21:39 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:20.231 13:21:39 -- common/autotest_common.sh@18 -- # set -x 00:09:20.231 13:21:39 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:09:20.231 13:21:39 -- ../common.sh@8 -- # pids=() 00:09:20.231 13:21:39 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:09:20.231 13:21:39 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:09:20.231 13:21:39 -- nvmf/run.sh@56 -- # fuzz_num=25 00:09:20.231 13:21:39 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:09:20.231 13:21:39 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:09:20.231 13:21:39 -- nvmf/run.sh@61 -- # mem_size=512 00:09:20.231 13:21:39 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:09:20.231 13:21:39 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:09:20.231 13:21:39 -- ../common.sh@69 -- # local fuzz_num=25 00:09:20.231 13:21:39 -- ../common.sh@70 -- # local time=1 00:09:20.231 13:21:39 -- ../common.sh@72 -- # (( i = 0 )) 00:09:20.231 13:21:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:20.231 13:21:39 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:20.231 13:21:39 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:09:20.231 13:21:39 -- nvmf/run.sh@24 -- # local timen=1 00:09:20.231 13:21:39 -- nvmf/run.sh@25 -- # local core=0x1 00:09:20.231 13:21:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:09:20.232 13:21:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:09:20.232 13:21:39 -- nvmf/run.sh@29 -- # printf %02d 0 00:09:20.232 13:21:39 -- nvmf/run.sh@29 -- # port=4400 00:09:20.232 13:21:39 -- nvmf/run.sh@30 -- # mkdir -p 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:09:20.232 13:21:39 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:09:20.232 13:21:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:20.232 13:21:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:09:20.232 [2024-07-24 13:21:39.060864] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:20.232 [2024-07-24 13:21:39.060945] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3167157 ] 00:09:20.490 EAL: No free 2048 kB hugepages reported on node 1 00:09:20.749 [2024-07-24 13:21:39.385232] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.749 [2024-07-24 13:21:39.417701] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:20.749 [2024-07-24 13:21:39.417882] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.749 [2024-07-24 13:21:39.472646] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:20.749 [2024-07-24 13:21:39.488887] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:09:20.749 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:20.749 INFO: Seed: 104373310 00:09:20.749 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:20.749 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:20.749 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:09:20.749 INFO: A corpus is not provided, starting from an empty corpus 00:09:20.749 #2 INITED exec/s: 0 rss: 61Mb 00:09:20.749 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:20.749 This may also happen if the target rejected all inputs we tried so far 00:09:20.749 [2024-07-24 13:21:39.544424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:20.749 [2024-07-24 13:21:39.544463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.354 NEW_FUNC[1/667]: 0x49e700 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:09:21.354 NEW_FUNC[2/667]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:21.354 #10 NEW cov: 11448 ft: 11449 corp: 2/80b lim: 320 exec/s: 0 rss: 68Mb L: 79/79 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:09:21.354 [2024-07-24 13:21:40.015722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:09:21.354 [2024-07-24 13:21:40.015779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.354 NEW_FUNC[1/3]: 0x1555260 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3790 00:09:21.354 NEW_FUNC[2/3]: 0x17220d0 in 
spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1507 00:09:21.354 #11 NEW cov: 11585 ft: 11861 corp: 3/200b lim: 320 exec/s: 0 rss: 68Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:09:21.354 [2024-07-24 13:21:40.085907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.354 [2024-07-24 13:21:40.085955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.354 #17 NEW cov: 11591 ft: 12209 corp: 4/280b lim: 320 exec/s: 0 rss: 68Mb L: 80/120 MS: 1 InsertByte- 00:09:21.354 [2024-07-24 13:21:40.135995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.354 [2024-07-24 13:21:40.136035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.354 #18 NEW cov: 11676 ft: 12436 corp: 5/385b lim: 320 exec/s: 0 rss: 69Mb L: 105/120 MS: 1 CrossOver- 00:09:21.354 [2024-07-24 13:21:40.196116] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.354 [2024-07-24 13:21:40.196153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.613 #19 NEW cov: 11676 ft: 12533 corp: 6/457b lim: 320 exec/s: 0 rss: 69Mb L: 72/120 MS: 1 EraseBytes- 00:09:21.613 [2024-07-24 13:21:40.236268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:34340000 00:09:21.613 [2024-07-24 13:21:40.236304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.613 #22 NEW cov: 11699 ft: 12666 corp: 7/566b lim: 320 
exec/s: 0 rss: 69Mb L: 109/120 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:09:21.613 [2024-07-24 13:21:40.286457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.613 [2024-07-24 13:21:40.286493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.613 #23 NEW cov: 11699 ft: 12769 corp: 8/646b lim: 320 exec/s: 0 rss: 69Mb L: 80/120 MS: 1 ChangeByte- 00:09:21.613 [2024-07-24 13:21:40.336566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1eff 00:09:21.613 [2024-07-24 13:21:40.336602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.613 #24 NEW cov: 11699 ft: 12808 corp: 9/751b lim: 320 exec/s: 0 rss: 69Mb L: 105/120 MS: 1 CMP- DE: "\377\377\377\036"- 00:09:21.613 [2024-07-24 13:21:40.396720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x84 00:09:21.613 [2024-07-24 13:21:40.396756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.613 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:21.613 #25 NEW cov: 11722 ft: 12836 corp: 10/823b lim: 320 exec/s: 0 rss: 69Mb L: 72/120 MS: 1 ChangeByte- 00:09:21.613 [2024-07-24 13:21:40.456912] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:6c6c6c6c SGL TRANSPORT DATA BLOCK TRANSPORT 0x6c6c6c6c6c6c6c6c 00:09:21.613 [2024-07-24 13:21:40.456950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.872 #26 NEW cov: 
11722 ft: 12869 corp: 11/890b lim: 320 exec/s: 0 rss: 69Mb L: 67/120 MS: 1 InsertRepeatedBytes- 00:09:21.872 [2024-07-24 13:21:40.497223] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.872 [2024-07-24 13:21:40.497258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.872 [2024-07-24 13:21:40.497334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:09:21.872 [2024-07-24 13:21:40.497356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:21.872 [2024-07-24 13:21:40.497428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.872 [2024-07-24 13:21:40.497452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:21.872 NEW_FUNC[1/1]: 0x12fe060 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:09:21.872 #27 NEW cov: 11753 ft: 13155 corp: 12/1088b lim: 320 exec/s: 27 rss: 69Mb L: 198/198 MS: 1 InsertRepeatedBytes- 00:09:21.872 [2024-07-24 13:21:40.547256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:21.872 [2024-07-24 13:21:40.547297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.872 [2024-07-24 13:21:40.547379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff 
cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:09:21.872 [2024-07-24 13:21:40.547404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:21.872 #28 NEW cov: 11753 ft: 13350 corp: 13/1269b lim: 320 exec/s: 28 rss: 69Mb L: 181/198 MS: 1 EraseBytes- 00:09:21.872 [2024-07-24 13:21:40.607354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:f6ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x1eff 00:09:21.872 [2024-07-24 13:21:40.607391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.872 #29 NEW cov: 11753 ft: 13390 corp: 14/1374b lim: 320 exec/s: 29 rss: 69Mb L: 105/198 MS: 1 ChangeBinInt- 00:09:21.872 [2024-07-24 13:21:40.667493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:09:21.872 [2024-07-24 13:21:40.667529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.872 #30 NEW cov: 11753 ft: 13429 corp: 15/1478b lim: 320 exec/s: 30 rss: 69Mb L: 104/198 MS: 1 EraseBytes- 00:09:21.872 [2024-07-24 13:21:40.727703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa0affffffff 00:09:21.872 [2024-07-24 13:21:40.727738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.872 [2024-07-24 13:21:40.727806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:21.872 [2024-07-24 13:21:40.727826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:09:22.130 #31 NEW cov: 11753 ft: 13475 corp: 16/1611b lim: 320 exec/s: 31 rss: 69Mb L: 133/198 MS: 1 InsertRepeatedBytes- 00:09:22.130 [2024-07-24 13:21:40.777833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:22.130 [2024-07-24 13:21:40.777869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.130 #32 NEW cov: 11753 ft: 13483 corp: 17/1692b lim: 320 exec/s: 32 rss: 70Mb L: 81/198 MS: 1 InsertByte- 00:09:22.130 [2024-07-24 13:21:40.838073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa0affffffff 00:09:22.131 [2024-07-24 13:21:40.838109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.131 [2024-07-24 13:21:40.838178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:22.131 [2024-07-24 13:21:40.838198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.131 #33 NEW cov: 11753 ft: 13517 corp: 18/1825b lim: 320 exec/s: 33 rss: 70Mb L: 133/198 MS: 1 ChangeBit- 00:09:22.131 [2024-07-24 13:21:40.898177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:22.131 [2024-07-24 13:21:40.898220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.131 #34 NEW cov: 11753 ft: 13529 corp: 19/1906b lim: 320 exec/s: 34 rss: 70Mb L: 81/198 MS: 1 ChangeBinInt- 00:09:22.131 [2024-07-24 13:21:40.958478] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa0affffffff 00:09:22.131 [2024-07-24 13:21:40.958513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.131 [2024-07-24 13:21:40.958581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:22.131 [2024-07-24 13:21:40.958601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.131 #40 NEW cov: 11753 ft: 13539 corp: 20/2039b lim: 320 exec/s: 40 rss: 70Mb L: 133/198 MS: 1 ShuffleBytes- 00:09:22.389 [2024-07-24 13:21:41.008486] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:22.389 [2024-07-24 13:21:41.008524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.389 #41 NEW cov: 11753 ft: 13549 corp: 21/2119b lim: 320 exec/s: 41 rss: 70Mb L: 80/198 MS: 1 ChangeBit- 00:09:22.389 [2024-07-24 13:21:41.058596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:22.389 [2024-07-24 13:21:41.058633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.389 #42 NEW cov: 11753 ft: 13610 corp: 22/2195b lim: 320 exec/s: 42 rss: 70Mb L: 76/198 MS: 1 CMP- DE: "\010\000\000\000"- 00:09:22.389 [2024-07-24 13:21:41.108845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00660000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2b0000f6185c018b 00:09:22.389 [2024-07-24 13:21:41.108881] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.389 #43 NEW cov: 11753 ft: 13627 corp: 23/2284b lim: 320 exec/s: 43 rss: 70Mb L: 89/198 MS: 1 CMP- DE: "\377,\347\213\001\\\030\366"- 00:09:22.389 [2024-07-24 13:21:41.169065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa0affffffff 00:09:22.389 [2024-07-24 13:21:41.169101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.389 [2024-07-24 13:21:41.169166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:22.389 [2024-07-24 13:21:41.169187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.390 #44 NEW cov: 11753 ft: 13635 corp: 24/2421b lim: 320 exec/s: 44 rss: 70Mb L: 137/198 MS: 1 PersAutoDict- DE: "\377\377\377\036"- 00:09:22.390 [2024-07-24 13:21:41.209101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffd1 00:09:22.390 [2024-07-24 13:21:41.209137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.390 #45 NEW cov: 11753 ft: 13658 corp: 25/2525b lim: 320 exec/s: 45 rss: 70Mb L: 104/198 MS: 1 ChangeByte- 00:09:22.649 [2024-07-24 13:21:41.269445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:34340000 00:09:22.649 [2024-07-24 13:21:41.269480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.649 [2024-07-24 13:21:41.269550] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:34343434 cdw11:34343434 00:09:22.649 [2024-07-24 13:21:41.269570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.649 #46 NEW cov: 11753 ft: 13719 corp: 26/2677b lim: 320 exec/s: 46 rss: 70Mb L: 152/198 MS: 1 CrossOver- 00:09:22.649 [2024-07-24 13:21:41.329433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:09:22.649 [2024-07-24 13:21:41.329469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.649 #47 NEW cov: 11753 ft: 13754 corp: 27/2797b lim: 320 exec/s: 47 rss: 70Mb L: 120/198 MS: 1 ShuffleBytes- 00:09:22.649 [2024-07-24 13:21:41.389622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:22.649 [2024-07-24 13:21:41.389658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.649 #48 NEW cov: 11753 ft: 13762 corp: 28/2876b lim: 320 exec/s: 48 rss: 70Mb L: 79/198 MS: 1 ChangeByte- 00:09:22.649 [2024-07-24 13:21:41.429651] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:22.649 [2024-07-24 13:21:41.429687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.649 #49 NEW cov: 11753 ft: 13800 corp: 29/2961b lim: 320 exec/s: 49 rss: 70Mb L: 85/198 MS: 1 PersAutoDict- DE: "\377\377\377\036"- 00:09:22.649 [2024-07-24 13:21:41.479889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:09:22.649 [2024-07-24 13:21:41.479924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.649 #50 NEW cov: 11753 ft: 13813 corp: 30/3033b lim: 320 exec/s: 50 rss: 70Mb L: 72/198 MS: 1 ChangeBinInt- 00:09:22.909 [2024-07-24 13:21:41.519937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:09:22.909 [2024-07-24 13:21:41.519973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.909 #51 NEW cov: 11753 ft: 13815 corp: 31/3154b lim: 320 exec/s: 25 rss: 70Mb L: 121/198 MS: 1 CopyPart- 00:09:22.909 #51 DONE cov: 11753 ft: 13815 corp: 31/3154b lim: 320 exec/s: 25 rss: 70Mb 00:09:22.909 ###### Recommended dictionary. ###### 00:09:22.909 "\377\377\377\036" # Uses: 3 00:09:22.909 "\010\000\000\000" # Uses: 0 00:09:22.909 "\377,\347\213\001\\\030\366" # Uses: 0 00:09:22.909 ###### End of recommended dictionary. 
###### 00:09:22.909 Done 51 runs in 2 second(s) 00:09:22.909 13:21:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:09:22.909 13:21:41 -- ../common.sh@72 -- # (( i++ )) 00:09:22.909 13:21:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:22.909 13:21:41 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:22.909 13:21:41 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:09:22.909 13:21:41 -- nvmf/run.sh@24 -- # local timen=1 00:09:22.909 13:21:41 -- nvmf/run.sh@25 -- # local core=0x1 00:09:22.909 13:21:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:09:22.909 13:21:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:09:22.909 13:21:41 -- nvmf/run.sh@29 -- # printf %02d 1 00:09:22.909 13:21:41 -- nvmf/run.sh@29 -- # port=4401 00:09:22.909 13:21:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:09:22.909 13:21:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:09:22.909 13:21:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:22.909 13:21:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:09:22.909 [2024-07-24 13:21:41.742860] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:22.909 [2024-07-24 13:21:41.742938] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3167524 ] 00:09:23.167 EAL: No free 2048 kB hugepages reported on node 1 00:09:23.426 [2024-07-24 13:21:42.114962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.426 [2024-07-24 13:21:42.148467] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:23.426 [2024-07-24 13:21:42.148645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.426 [2024-07-24 13:21:42.203351] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:23.426 [2024-07-24 13:21:42.219595] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:09:23.426 INFO: Running with entropic power schedule (0xFF, 100). 00:09:23.426 INFO: Seed: 2836388085 00:09:23.426 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:23.426 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:23.426 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:09:23.426 INFO: A corpus is not provided, starting from an empty corpus 00:09:23.426 #2 INITED exec/s: 0 rss: 61Mb 00:09:23.426 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:23.426 This may also happen if the target rejected all inputs we tried so far 00:09:23.685 [2024-07-24 13:21:42.296333] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:23.685 [2024-07-24 13:21:42.296852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.685 [2024-07-24 13:21:42.296901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.943 NEW_FUNC[1/668]: 0x49f000 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:09:23.943 NEW_FUNC[2/668]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:23.943 #10 NEW cov: 11512 ft: 11554 corp: 2/11b lim: 30 exec/s: 0 rss: 68Mb L: 10/10 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:09:23.943 [2024-07-24 13:21:42.627459] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:23.943 [2024-07-24 13:21:42.627767] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:23.943 [2024-07-24 13:21:42.628061] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:23.943 [2024-07-24 13:21:42.628566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.943 [2024-07-24 13:21:42.628628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.943 [2024-07-24 13:21:42.628757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.943 [2024-07-24 13:21:42.628787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:23.943 [2024-07-24 13:21:42.628898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.943 [2024-07-24 13:21:42.628927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:23.943 NEW_FUNC[1/3]: 0x1975690 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:09:23.943 NEW_FUNC[2/3]: 0x1976de0 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:09:23.943 #13 NEW cov: 11689 ft: 12494 corp: 3/29b lim: 30 exec/s: 0 rss: 68Mb L: 18/18 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:09:23.943 [2024-07-24 13:21:42.687535] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:23.943 [2024-07-24 13:21:42.688031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.943 [2024-07-24 13:21:42.688061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.943 #14 NEW cov: 11695 ft: 12798 corp: 4/39b lim: 30 exec/s: 0 rss: 69Mb L: 10/18 MS: 1 ShuffleBytes- 00:09:23.943 [2024-07-24 13:21:42.748045] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:23.943 [2024-07-24 13:21:42.748523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.943 [2024-07-24 13:21:42.748550] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.943 #15 NEW cov: 11780 ft: 13138 corp: 5/49b lim: 30 exec/s: 0 rss: 69Mb L: 10/18 MS: 1 ChangeBinInt- 00:09:23.943 [2024-07-24 13:21:42.798469] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004a51 00:09:23.943 [2024-07-24 13:21:42.798972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.943 [2024-07-24 13:21:42.799002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.202 #16 NEW cov: 11780 ft: 13320 corp: 6/59b lim: 30 exec/s: 0 rss: 69Mb L: 10/18 MS: 1 ChangeBinInt- 00:09:24.202 [2024-07-24 13:21:42.848813] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:24.202 [2024-07-24 13:21:42.849320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.849348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.202 #17 NEW cov: 11780 ft: 13408 corp: 7/70b lim: 30 exec/s: 0 rss: 69Mb L: 11/18 MS: 1 InsertByte- 00:09:24.202 [2024-07-24 13:21:42.899551] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39980) > buf size (4096) 00:09:24.202 [2024-07-24 13:21:42.901132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:270a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.901161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.202 [2024-07-24 13:21:42.901249] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.901266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.202 [2024-07-24 13:21:42.901356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.901372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.202 [2024-07-24 13:21:42.901466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.901485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.202 [2024-07-24 13:21:42.901572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.901588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:24.202 #20 NEW cov: 11797 ft: 14059 corp: 8/100b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:09:24.202 [2024-07-24 13:21:42.959648] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:24.202 [2024-07-24 13:21:42.959950] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008cae 00:09:24.202 [2024-07-24 13:21:42.960475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.960503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.202 [2024-07-24 13:21:42.960594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:42.960610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.202 #21 NEW cov: 11797 ft: 14336 corp: 9/112b lim: 30 exec/s: 0 rss: 69Mb L: 12/30 MS: 1 CrossOver- 00:09:24.202 [2024-07-24 13:21:43.020092] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:24.202 [2024-07-24 13:21:43.020397] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aee9 00:09:24.202 [2024-07-24 13:21:43.020909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:43.020936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.202 [2024-07-24 13:21:43.021033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffae02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.202 [2024-07-24 13:21:43.021049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.202 #22 NEW cov: 11797 ft: 14390 corp: 10/124b lim: 30 exec/s: 0 rss: 69Mb L: 12/30 MS: 1 InsertByte- 00:09:24.461 [2024-07-24 13:21:43.080449] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:24.461 [2024-07-24 13:21:43.080736] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x20000a6e9 00:09:24.461 [2024-07-24 13:21:43.081278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.461 [2024-07-24 13:21:43.081306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.461 [2024-07-24 13:21:43.081400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffae02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.461 [2024-07-24 13:21:43.081416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.461 #23 NEW cov: 11797 ft: 14407 corp: 11/136b lim: 30 exec/s: 0 rss: 69Mb L: 12/30 MS: 1 ChangeBit- 00:09:24.461 [2024-07-24 13:21:43.140848] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008cae 00:09:24.461 [2024-07-24 13:21:43.141349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.461 [2024-07-24 13:21:43.141378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.461 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:24.461 #24 NEW cov: 11820 ft: 14474 corp: 12/142b lim: 30 exec/s: 0 rss: 69Mb L: 6/30 MS: 1 EraseBytes- 00:09:24.461 [2024-07-24 13:21:43.201185] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:24.461 [2024-07-24 13:21:43.201498] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008cae 00:09:24.462 [2024-07-24 13:21:43.202012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.462 [2024-07-24 13:21:43.202041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.462 [2024-07-24 13:21:43.202134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.462 [2024-07-24 13:21:43.202150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.462 #25 NEW cov: 11820 ft: 14546 corp: 13/154b lim: 30 exec/s: 0 rss: 69Mb L: 12/30 MS: 1 ShuffleBytes- 00:09:24.462 [2024-07-24 13:21:43.251317] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:24.462 [2024-07-24 13:21:43.251594] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000a6e9 00:09:24.462 [2024-07-24 13:21:43.252154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.462 [2024-07-24 13:21:43.252184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.462 [2024-07-24 13:21:43.252263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffae02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.462 [2024-07-24 13:21:43.252279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.462 #26 NEW cov: 11820 ft: 14574 corp: 14/166b lim: 30 exec/s: 26 rss: 69Mb L: 12/30 MS: 1 ChangeBit- 00:09:24.462 [2024-07-24 13:21:43.311615] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:24.462 [2024-07-24 
13:21:43.311894] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000a6e9 00:09:24.462 [2024-07-24 13:21:43.312429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.462 [2024-07-24 13:21:43.312459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.462 [2024-07-24 13:21:43.312550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f7ae02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.462 [2024-07-24 13:21:43.312565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.720 #27 NEW cov: 11820 ft: 14595 corp: 15/178b lim: 30 exec/s: 27 rss: 69Mb L: 12/30 MS: 1 ChangeBit- 00:09:24.720 [2024-07-24 13:21:43.362084] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:24.720 [2024-07-24 13:21:43.362377] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143364) > buf size (4096) 00:09:24.720 [2024-07-24 13:21:43.362667] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (35980) > len (4) 00:09:24.720 [2024-07-24 13:21:43.363157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.363184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.720 [2024-07-24 13:21:43.363274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.363293] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.720 [2024-07-24 13:21:43.363399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.363414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.720 #28 NEW cov: 11826 ft: 14624 corp: 16/196b lim: 30 exec/s: 28 rss: 69Mb L: 18/30 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\001"- 00:09:24.720 [2024-07-24 13:21:43.412484] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535296) > buf size (4096) 00:09:24.720 [2024-07-24 13:21:43.412788] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:09:24.720 [2024-07-24 13:21:43.413266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.413297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.720 [2024-07-24 13:21:43.413395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.413410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.720 #29 NEW cov: 11826 ft: 14721 corp: 17/208b lim: 30 exec/s: 29 rss: 69Mb L: 12/30 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:09:24.720 [2024-07-24 13:21:43.463252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 
[2024-07-24 13:21:43.463277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.720 #30 NEW cov: 11826 ft: 14789 corp: 18/218b lim: 30 exec/s: 30 rss: 69Mb L: 10/30 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:09:24.720 [2024-07-24 13:21:43.523422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000aeae 00:09:24.720 [2024-07-24 13:21:43.523732] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000a6e9 00:09:24.720 [2024-07-24 13:21:43.524229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3f02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.524272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.720 [2024-07-24 13:21:43.524362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffae02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.524377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.720 #31 NEW cov: 11826 ft: 14798 corp: 19/230b lim: 30 exec/s: 31 rss: 70Mb L: 12/30 MS: 1 ChangeBit- 00:09:24.720 [2024-07-24 13:21:43.583689] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535296) > buf size (4096) 00:09:24.720 [2024-07-24 13:21:43.584184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.720 [2024-07-24 13:21:43.584220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.977 #32 NEW cov: 11826 ft: 14803 corp: 20/236b lim: 30 exec/s: 32 rss: 70Mb L: 
6/30 MS: 1 ChangeBinInt- 00:09:24.977 [2024-07-24 13:21:43.634336] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:24.977 [2024-07-24 13:21:43.634621] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:24.977 [2024-07-24 13:21:43.634889] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:24.977 [2024-07-24 13:21:43.635391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.635421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.635516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.635531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.635617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.635634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.977 #33 NEW cov: 11826 ft: 14834 corp: 21/254b lim: 30 exec/s: 33 rss: 70Mb L: 18/30 MS: 1 CopyPart- 00:09:24.977 [2024-07-24 13:21:43.684645] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535296) > buf size (4096) 00:09:24.977 [2024-07-24 13:21:43.685138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf0206 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:09:24.977 [2024-07-24 13:21:43.685167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.977 #34 NEW cov: 11826 ft: 14860 corp: 22/260b lim: 30 exec/s: 34 rss: 70Mb L: 6/30 MS: 1 ShuffleBytes- 00:09:24.977 [2024-07-24 13:21:43.745737] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000aeae 00:09:24.977 [2024-07-24 13:21:43.746245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.746273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.746378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.746393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.746484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.746502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.977 #35 NEW cov: 11826 ft: 14902 corp: 23/278b lim: 30 exec/s: 35 rss: 70Mb L: 18/30 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:09:24.977 [2024-07-24 13:21:43.805536] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (39980) > buf size (4096) 00:09:24.977 [2024-07-24 13:21:43.807153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:270a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.807181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.807278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.807294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.807390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.807410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.807504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.807519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.977 [2024-07-24 13:21:43.807610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.977 [2024-07-24 13:21:43.807625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:24.977 #36 NEW cov: 11826 ft: 14964 corp: 24/308b lim: 30 exec/s: 36 rss: 70Mb L: 30/30 MS: 1 ChangeBinInt- 00:09:25.235 [2024-07-24 13:21:43.867671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 
[2024-07-24 13:21:43.867698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.867796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.867811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.867899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.867914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.868009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.868024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.868115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.868131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:25.235 #37 NEW cov: 11826 ft: 14996 corp: 25/338b lim: 30 exec/s: 37 rss: 70Mb L: 30/30 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:09:25.235 [2024-07-24 13:21:43.926121] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:25.235 [2024-07-24 13:21:43.926432] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:25.235 [2024-07-24 13:21:43.926933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.926960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.927051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.927068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.235 #38 NEW cov: 11826 ft: 15048 corp: 26/350b lim: 30 exec/s: 38 rss: 70Mb L: 12/30 MS: 1 EraseBytes- 00:09:25.235 [2024-07-24 13:21:43.986547] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:25.235 [2024-07-24 13:21:43.986850] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008c0a 00:09:25.235 [2024-07-24 13:21:43.987150] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:25.235 [2024-07-24 13:21:43.987475] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:25.235 [2024-07-24 13:21:43.987985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.988013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.988107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0abf02ae 
cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.988125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.988216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.988232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:25.235 [2024-07-24 13:21:43.988329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:43.988344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:25.235 #39 NEW cov: 11826 ft: 15062 corp: 27/376b lim: 30 exec/s: 39 rss: 70Mb L: 26/30 MS: 1 CrossOver- 00:09:25.235 [2024-07-24 13:21:44.046649] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000aeae 00:09:25.235 [2024-07-24 13:21:44.047188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:aeae83ae cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:44.047220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.235 #40 NEW cov: 11826 ft: 15075 corp: 28/385b lim: 30 exec/s: 40 rss: 70Mb L: 9/30 MS: 1 EraseBytes- 00:09:25.235 [2024-07-24 13:21:44.097064] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535296) > buf size (4096) 00:09:25.235 [2024-07-24 13:21:44.097567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0abf02ab cdw11:00000002 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:09:25.235 [2024-07-24 13:21:44.097595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.494 #41 NEW cov: 11826 ft: 15078 corp: 29/391b lim: 30 exec/s: 41 rss: 70Mb L: 6/30 MS: 1 ChangeBinInt- 00:09:25.494 [2024-07-24 13:21:44.147446] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:09:25.494 [2024-07-24 13:21:44.147952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.494 [2024-07-24 13:21:44.147979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.494 #42 NEW cov: 11826 ft: 15101 corp: 30/401b lim: 30 exec/s: 42 rss: 70Mb L: 10/30 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:09:25.494 [2024-07-24 13:21:44.197817] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:09:25.494 [2024-07-24 13:21:44.198108] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (143924) > buf size (4096) 00:09:25.494 [2024-07-24 13:21:44.198427] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:09:25.494 [2024-07-24 13:21:44.198983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.494 [2024-07-24 13:21:44.199010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.494 [2024-07-24 13:21:44.199099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8c8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.494 [2024-07-24 13:21:44.199115] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.495 [2024-07-24 13:21:44.199201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.495 [2024-07-24 13:21:44.199219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:25.495 #45 NEW cov: 11826 ft: 15118 corp: 31/422b lim: 30 exec/s: 45 rss: 70Mb L: 21/30 MS: 3 CopyPart-InsertByte-CrossOver- 00:09:25.495 [2024-07-24 13:21:44.247954] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10804) > buf size (4096) 00:09:25.495 [2024-07-24 13:21:44.248255] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008cae 00:09:25.495 [2024-07-24 13:21:44.248765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8c008c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.495 [2024-07-24 13:21:44.248794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.495 [2024-07-24 13:21:44.248883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:02bf02ae cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.495 [2024-07-24 13:21:44.248901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.495 #46 NEW cov: 11826 ft: 15150 corp: 32/434b lim: 30 exec/s: 23 rss: 70Mb L: 12/30 MS: 1 ChangeBit- 00:09:25.495 #46 DONE cov: 11826 ft: 15150 corp: 32/434b lim: 30 exec/s: 23 rss: 70Mb 00:09:25.495 ###### Recommended dictionary. ###### 00:09:25.495 "\000\000\000\000\000\000\000\001" # Uses: 5 00:09:25.495 ###### End of recommended dictionary. 
###### 00:09:25.495 Done 46 runs in 2 second(s) 00:09:25.754 13:21:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:09:25.754 13:21:44 -- ../common.sh@72 -- # (( i++ )) 00:09:25.754 13:21:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:25.754 13:21:44 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:25.754 13:21:44 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:09:25.754 13:21:44 -- nvmf/run.sh@24 -- # local timen=1 00:09:25.754 13:21:44 -- nvmf/run.sh@25 -- # local core=0x1 00:09:25.754 13:21:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:09:25.754 13:21:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:09:25.754 13:21:44 -- nvmf/run.sh@29 -- # printf %02d 2 00:09:25.754 13:21:44 -- nvmf/run.sh@29 -- # port=4402 00:09:25.754 13:21:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:09:25.754 13:21:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:09:25.754 13:21:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:25.754 13:21:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:09:25.754 [2024-07-24 13:21:44.444879] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:25.754 [2024-07-24 13:21:44.444972] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3167892 ] 00:09:25.754 EAL: No free 2048 kB hugepages reported on node 1 00:09:26.012 [2024-07-24 13:21:44.744544] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:26.012 [2024-07-24 13:21:44.777090] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:26.012 [2024-07-24 13:21:44.777277] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.012 [2024-07-24 13:21:44.831848] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:26.012 [2024-07-24 13:21:44.848093] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:09:26.012 INFO: Running with entropic power schedule (0xFF, 100). 00:09:26.012 INFO: Seed: 1168404930 00:09:26.270 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:26.270 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:26.270 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:09:26.270 INFO: A corpus is not provided, starting from an empty corpus 00:09:26.270 #2 INITED exec/s: 0 rss: 61Mb 00:09:26.270 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:26.270 This may also happen if the target rejected all inputs we tried so far 00:09:26.270 [2024-07-24 13:21:44.904173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.270 [2024-07-24 13:21:44.904225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.270 [2024-07-24 13:21:44.904300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.270 [2024-07-24 13:21:44.904322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.270 [2024-07-24 13:21:44.904392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.270 [2024-07-24 13:21:44.904413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.270 [2024-07-24 13:21:44.904482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.270 [2024-07-24 13:21:44.904503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.529 NEW_FUNC[1/670]: 0x4a1a20 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:09:26.529 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:26.529 #8 NEW cov: 11511 ft: 11510 corp: 2/33b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertRepeatedBytes- 
00:09:26.529 [2024-07-24 13:21:45.374943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.529 [2024-07-24 13:21:45.374995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.529 [2024-07-24 13:21:45.375063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.529 [2024-07-24 13:21:45.375085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.787 #12 NEW cov: 11624 ft: 12595 corp: 3/47b lim: 35 exec/s: 0 rss: 68Mb L: 14/32 MS: 4 ShuffleBytes-CopyPart-ChangeBit-CrossOver- 00:09:26.787 [2024-07-24 13:21:45.425305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3a3a000a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.425344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 13:21:45.425409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.425434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 13:21:45.425498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.425519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 
13:21:45.425583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.425603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.787 #15 NEW cov: 11630 ft: 12786 corp: 4/80b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:09:26.787 [2024-07-24 13:21:45.464937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a3a000a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.464974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.787 #17 NEW cov: 11715 ft: 13390 corp: 5/89b lim: 35 exec/s: 0 rss: 68Mb L: 9/33 MS: 2 ShuffleBytes-CrossOver- 00:09:26.787 [2024-07-24 13:21:45.515552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3a3a000a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.515589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 13:21:45.515657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3a3b003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.515677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 13:21:45.515741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.515760] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 13:21:45.515823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.515843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.787 #18 NEW cov: 11715 ft: 13473 corp: 6/123b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertByte- 00:09:26.787 [2024-07-24 13:21:45.575379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.787 [2024-07-24 13:21:45.575415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.787 [2024-07-24 13:21:45.575483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.788 [2024-07-24 13:21:45.575504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.788 #19 NEW cov: 11715 ft: 13543 corp: 7/137b lim: 35 exec/s: 0 rss: 69Mb L: 14/34 MS: 1 CopyPart- 00:09:26.788 [2024-07-24 13:21:45.635570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.788 [2024-07-24 13:21:45.635606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.788 [2024-07-24 13:21:45.635676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.788 [2024-07-24 
13:21:45.635697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.047 #20 NEW cov: 11715 ft: 13622 corp: 8/151b lim: 35 exec/s: 0 rss: 69Mb L: 14/34 MS: 1 CrossOver- 00:09:27.047 [2024-07-24 13:21:45.696021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.696058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.696125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.696146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.696217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.696238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.696308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:ff00b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.696328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.047 #21 NEW cov: 11715 ft: 13663 corp: 9/185b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:09:27.047 [2024-07-24 13:21:45.756184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.756225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.756293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.756313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.756378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.756398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.756461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.756481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.047 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:27.047 #22 NEW cov: 11738 ft: 13778 corp: 10/218b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 CopyPart- 00:09:27.047 [2024-07-24 13:21:45.806041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.806077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.806144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:26ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.806165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.047 #23 NEW cov: 11738 ft: 13817 corp: 11/232b lim: 35 exec/s: 0 rss: 69Mb L: 14/34 MS: 1 ChangeByte- 00:09:27.047 [2024-07-24 13:21:45.856146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.856181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.047 [2024-07-24 13:21:45.856256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff0026ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.047 [2024-07-24 13:21:45.856278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.047 #24 NEW cov: 11738 ft: 13839 corp: 12/246b lim: 35 exec/s: 24 rss: 69Mb L: 14/34 MS: 1 ShuffleBytes- 00:09:27.306 [2024-07-24 13:21:45.916363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dfff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:45.916398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:45.916466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:45.916486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.306 #25 NEW cov: 11738 
ft: 13867 corp: 13/260b lim: 35 exec/s: 25 rss: 69Mb L: 14/34 MS: 1 ChangeBit- 00:09:27.306 [2024-07-24 13:21:45.976811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3a3a000a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:45.976847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:45.976914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:45.976935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:45.977000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:45.977020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:45.977084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3aff003a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:45.977104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.306 #26 NEW cov: 11738 ft: 13916 corp: 14/293b lim: 35 exec/s: 26 rss: 69Mb L: 33/34 MS: 1 CrossOver- 00:09:27.306 [2024-07-24 13:21:46.026655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.026690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.026758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.026778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.306 #27 NEW cov: 11738 ft: 13923 corp: 15/313b lim: 35 exec/s: 27 rss: 69Mb L: 20/34 MS: 1 CopyPart- 00:09:27.306 [2024-07-24 13:21:46.077083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3a3a000a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.077123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.077189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.077216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.077282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.077303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.077368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3aff003a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.077389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.306 #28 NEW 
cov: 11738 ft: 13955 corp: 16/346b lim: 35 exec/s: 28 rss: 69Mb L: 33/34 MS: 1 ShuffleBytes- 00:09:27.306 [2024-07-24 13:21:46.137253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.137291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.137359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffd300ff cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.137380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.137445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d3d300d3 cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.137465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.306 [2024-07-24 13:21:46.137527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d3d300d3 cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.306 [2024-07-24 13:21:46.137547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.565 #29 NEW cov: 11738 ft: 13963 corp: 17/379b lim: 35 exec/s: 29 rss: 70Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:09:27.565 [2024-07-24 13:21:46.197387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3a3a000a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.197423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.565 [2024-07-24 13:21:46.197486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.197507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.565 [2024-07-24 13:21:46.197573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3a3a002c cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.197593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.565 [2024-07-24 13:21:46.197656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3aff003a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.197675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.565 #30 NEW cov: 11738 ft: 14007 corp: 18/412b lim: 35 exec/s: 30 rss: 70Mb L: 33/34 MS: 1 ChangeByte- 00:09:27.565 [2024-07-24 13:21:46.257228] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.565 [2024-07-24 13:21:46.257510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.257546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.565 [2024-07-24 13:21:46.257615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:000026ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 
13:21:46.257636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.565 [2024-07-24 13:21:46.257703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.257727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.565 #31 NEW cov: 11747 ft: 14208 corp: 19/434b lim: 35 exec/s: 31 rss: 70Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:09:27.565 [2024-07-24 13:21:46.307727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:b500ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.307763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.565 [2024-07-24 13:21:46.307829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.565 [2024-07-24 13:21:46.307850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.307915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.307935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.307998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:ff00b5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.308019] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.566 #32 NEW cov: 11747 ft: 14210 corp: 20/468b lim: 35 exec/s: 32 rss: 70Mb L: 34/34 MS: 1 CopyPart- 00:09:27.566 [2024-07-24 13:21:46.367931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:b500ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.367968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.368035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.368057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.368120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.368140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.368205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:ff00b5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.368236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.566 #33 NEW cov: 11747 ft: 14221 corp: 21/502b lim: 35 exec/s: 33 rss: 70Mb L: 34/34 MS: 1 ShuffleBytes- 00:09:27.566 [2024-07-24 13:21:46.428060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3a3a000a cdw11:3a003a3a SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.428096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.428164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.428185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.428251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3a3a003a cdw11:3a003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.428272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.566 [2024-07-24 13:21:46.428336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3a3a003a cdw11:30003a3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.566 [2024-07-24 13:21:46.428355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.824 #34 NEW cov: 11747 ft: 14225 corp: 22/535b lim: 35 exec/s: 34 rss: 70Mb L: 33/34 MS: 1 ChangeByte- 00:09:27.824 [2024-07-24 13:21:46.467857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.467893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.824 [2024-07-24 13:21:46.467958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 
13:21:46.467980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.824 #35 NEW cov: 11747 ft: 14251 corp: 23/549b lim: 35 exec/s: 35 rss: 70Mb L: 14/34 MS: 1 ShuffleBytes- 00:09:27.824 [2024-07-24 13:21:46.518022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dfff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.518058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.824 [2024-07-24 13:21:46.518126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:feff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.518146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.824 #36 NEW cov: 11747 ft: 14265 corp: 24/563b lim: 35 exec/s: 36 rss: 70Mb L: 14/34 MS: 1 ChangeBit- 00:09:27.824 [2024-07-24 13:21:46.578218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.578254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.824 [2024-07-24 13:21:46.578319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f7ff00ff cdw11:ff0026ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.578340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.824 #37 NEW cov: 11747 ft: 14279 corp: 25/577b lim: 35 exec/s: 37 rss: 70Mb L: 14/34 MS: 1 ChangeBit- 00:09:27.824 [2024-07-24 13:21:46.628496] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.628532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.824 [2024-07-24 13:21:46.628597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.628618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.824 [2024-07-24 13:21:46.628687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.824 [2024-07-24 13:21:46.628707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.825 #38 NEW cov: 11747 ft: 14328 corp: 26/599b lim: 35 exec/s: 38 rss: 70Mb L: 22/34 MS: 1 EraseBytes- 00:09:27.825 [2024-07-24 13:21:46.688839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:b500ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.825 [2024-07-24 13:21:46.688875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.825 [2024-07-24 13:21:46.688940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:f500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.825 [2024-07-24 13:21:46.688961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.825 [2024-07-24 13:21:46.689026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.825 [2024-07-24 13:21:46.689045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.825 [2024-07-24 13:21:46.689109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:ff00b5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.825 [2024-07-24 13:21:46.689130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:28.083 #39 NEW cov: 11747 ft: 14331 corp: 27/633b lim: 35 exec/s: 39 rss: 70Mb L: 34/34 MS: 1 ChangeBit- 00:09:28.083 [2024-07-24 13:21:46.738521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.738556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:28.083 #40 NEW cov: 11747 ft: 14354 corp: 28/642b lim: 35 exec/s: 40 rss: 70Mb L: 9/34 MS: 1 EraseBytes- 00:09:28.083 [2024-07-24 13:21:46.788822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.788858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:28.083 [2024-07-24 13:21:46.788925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0eff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.788945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:28.083 #41 NEW cov: 11747 ft: 14388 
corp: 29/656b lim: 35 exec/s: 41 rss: 70Mb L: 14/34 MS: 1 ChangeBinInt- 00:09:28.083 [2024-07-24 13:21:46.839091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0aff0021 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.839127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:28.083 [2024-07-24 13:21:46.839198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.839224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:28.083 #42 NEW cov: 11747 ft: 14436 corp: 30/677b lim: 35 exec/s: 42 rss: 70Mb L: 21/34 MS: 1 InsertByte- 00:09:28.083 [2024-07-24 13:21:46.899114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dfff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.899150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:28.083 [2024-07-24 13:21:46.899221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fbff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.083 [2024-07-24 13:21:46.899242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:28.083 #43 NEW cov: 11747 ft: 14466 corp: 31/691b lim: 35 exec/s: 21 rss: 70Mb L: 14/34 MS: 1 ChangeBit- 00:09:28.083 #43 DONE cov: 11747 ft: 14466 corp: 31/691b lim: 35 exec/s: 21 rss: 70Mb 00:09:28.083 Done 43 runs in 2 second(s) 00:09:28.343 13:21:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:09:28.343 13:21:47 -- 
../common.sh@72 -- # (( i++ )) 00:09:28.343 13:21:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:28.343 13:21:47 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:28.343 13:21:47 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:09:28.343 13:21:47 -- nvmf/run.sh@24 -- # local timen=1 00:09:28.343 13:21:47 -- nvmf/run.sh@25 -- # local core=0x1 00:09:28.343 13:21:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:09:28.343 13:21:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:09:28.343 13:21:47 -- nvmf/run.sh@29 -- # printf %02d 3 00:09:28.343 13:21:47 -- nvmf/run.sh@29 -- # port=4403 00:09:28.343 13:21:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:09:28.343 13:21:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:09:28.343 13:21:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:28.343 13:21:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:09:28.343 [2024-07-24 13:21:47.109931] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:28.343 [2024-07-24 13:21:47.110002] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168258 ] 00:09:28.343 EAL: No free 2048 kB hugepages reported on node 1 00:09:28.602 [2024-07-24 13:21:47.446910] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.860 [2024-07-24 13:21:47.482194] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:28.860 [2024-07-24 13:21:47.482380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:28.860 [2024-07-24 13:21:47.537078] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:28.860 [2024-07-24 13:21:47.553325] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:09:28.860 INFO: Running with entropic power schedule (0xFF, 100). 00:09:28.860 INFO: Seed: 3875409140 00:09:28.860 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:28.860 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:28.860 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:09:28.861 INFO: A corpus is not provided, starting from an empty corpus 00:09:28.861 #2 INITED exec/s: 0 rss: 61Mb 00:09:28.861 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:28.861 This may also happen if the target rejected all inputs we tried so far 00:09:29.428 NEW_FUNC[1/654]: 0x4a36f0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:09:29.428 NEW_FUNC[2/654]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:29.428 #4 NEW cov: 11365 ft: 11366 corp: 2/5b lim: 20 exec/s: 0 rss: 68Mb L: 4/4 MS: 2 InsertByte-CopyPart- 00:09:29.428 NEW_FUNC[1/5]: 0x16decf0 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757 00:09:29.429 NEW_FUNC[2/5]: 0x17402a0 in nvme_transport_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:606 00:09:29.429 #5 NEW cov: 11522 ft: 11899 corp: 3/11b lim: 20 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:09:29.429 [2024-07-24 13:21:48.180043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.429 [2024-07-24 13:21:48.180105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.429 NEW_FUNC[1/20]: 0x115bea0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:09:29.429 NEW_FUNC[2/20]: 0x115ca20 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:09:29.429 #7 NEW cov: 11876 ft: 13019 corp: 4/30b lim: 20 exec/s: 0 rss: 69Mb L: 19/19 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:29.688 #8 NEW cov: 11961 ft: 13270 corp: 5/36b lim: 20 exec/s: 0 rss: 69Mb L: 6/19 MS: 1 CMP- DE: "\000\000\000\000"- 00:09:29.688 #9 NEW cov: 11961 ft: 13381 corp: 6/43b lim: 20 exec/s: 0 rss: 69Mb L: 7/19 MS: 1 InsertByte- 00:09:29.688 [2024-07-24 13:21:48.450566] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.688 [2024-07-24 13:21:48.450616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.688 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:29.688 #10 NEW cov: 11989 ft: 13706 corp: 7/54b lim: 20 exec/s: 0 rss: 69Mb L: 11/19 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:09:29.688 [2024-07-24 13:21:48.551017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.688 [2024-07-24 13:21:48.551062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.948 #11 NEW cov: 11992 ft: 13826 corp: 8/73b lim: 20 exec/s: 11 rss: 69Mb L: 19/19 MS: 1 ChangeByte- 00:09:29.948 #12 NEW cov: 11992 ft: 13930 corp: 9/84b lim: 20 exec/s: 12 rss: 69Mb L: 11/19 MS: 1 CMP- DE: "\000\000\002\000"- 00:09:29.948 #13 NEW cov: 11992 ft: 13993 corp: 10/101b lim: 20 exec/s: 13 rss: 69Mb L: 17/19 MS: 1 InsertRepeatedBytes- 00:09:30.206 #14 NEW cov: 11996 ft: 14163 corp: 11/116b lim: 20 exec/s: 14 rss: 69Mb L: 15/19 MS: 1 EraseBytes- 00:09:30.206 #15 NEW cov: 11996 ft: 14257 corp: 12/121b lim: 20 exec/s: 15 rss: 69Mb L: 5/19 MS: 1 EraseBytes- 00:09:30.206 #16 NEW cov: 11996 ft: 14327 corp: 13/136b lim: 20 exec/s: 16 rss: 69Mb L: 15/19 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:09:30.206 #17 NEW cov: 11996 ft: 14415 corp: 14/148b lim: 20 exec/s: 17 rss: 69Mb L: 12/19 MS: 1 InsertByte- 00:09:30.465 #18 NEW cov: 11996 ft: 14441 corp: 15/160b lim: 20 exec/s: 18 rss: 69Mb L: 12/19 MS: 1 ChangeBit- 00:09:30.465 #19 NEW cov: 11996 ft: 14457 corp: 16/175b lim: 20 exec/s: 19 rss: 69Mb L: 15/19 MS: 1 ChangeBinInt- 00:09:30.465 #20 NEW cov: 11996 ft: 14487 
corp: 17/179b lim: 20 exec/s: 20 rss: 69Mb L: 4/19 MS: 1 ShuffleBytes- 00:09:30.465 [2024-07-24 13:21:49.253799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.465 [2024-07-24 13:21:49.253846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.465 #21 NEW cov: 11996 ft: 14563 corp: 18/198b lim: 20 exec/s: 21 rss: 70Mb L: 19/19 MS: 1 ShuffleBytes- 00:09:30.465 [2024-07-24 13:21:49.313640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.465 [2024-07-24 13:21:49.313680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.723 #22 NEW cov: 11996 ft: 14641 corp: 19/209b lim: 20 exec/s: 22 rss: 70Mb L: 11/19 MS: 1 CopyPart- 00:09:30.723 [2024-07-24 13:21:49.364019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.723 [2024-07-24 13:21:49.364059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:2 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.723 #23 NEW cov: 11996 ft: 14665 corp: 20/225b lim: 20 exec/s: 23 rss: 70Mb L: 16/19 MS: 1 InsertByte- 00:09:30.723 #24 NEW cov: 11996 ft: 14682 corp: 21/229b lim: 20 exec/s: 24 rss: 70Mb L: 4/19 MS: 1 CrossOver- 00:09:30.723 #25 NEW cov: 11996 ft: 14694 corp: 22/234b lim: 20 exec/s: 25 rss: 70Mb L: 5/19 MS: 1 CrossOver- 00:09:30.723 [2024-07-24 13:21:49.524543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.723 [2024-07-24 13:21:49.524581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:09:30.723 #26 NEW cov: 11996 ft: 14804 corp: 23/252b lim: 20 exec/s: 26 rss: 70Mb L: 18/19 MS: 1 InsertRepeatedBytes- 00:09:30.982 #27 NEW cov: 11996 ft: 14826 corp: 24/263b lim: 20 exec/s: 13 rss: 70Mb L: 11/19 MS: 1 ShuffleBytes- 00:09:30.982 #27 DONE cov: 11996 ft: 14826 corp: 24/263b lim: 20 exec/s: 13 rss: 70Mb 00:09:30.982 ###### Recommended dictionary. ###### 00:09:30.982 "\000\000\000\000" # Uses: 1 00:09:30.982 "\000\000\002\000" # Uses: 1 00:09:30.982 ###### End of recommended dictionary. ###### 00:09:30.982 Done 27 runs in 2 second(s) 00:09:30.982 13:21:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:09:30.982 13:21:49 -- ../common.sh@72 -- # (( i++ )) 00:09:30.982 13:21:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:30.982 13:21:49 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:30.982 13:21:49 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:09:30.982 13:21:49 -- nvmf/run.sh@24 -- # local timen=1 00:09:30.982 13:21:49 -- nvmf/run.sh@25 -- # local core=0x1 00:09:30.982 13:21:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:09:30.982 13:21:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:09:30.982 13:21:49 -- nvmf/run.sh@29 -- # printf %02d 4 00:09:30.982 13:21:49 -- nvmf/run.sh@29 -- # port=4404 00:09:30.982 13:21:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:09:30.982 13:21:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:09:30.982 13:21:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:30.982 13:21:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:09:30.982 [2024-07-24 13:21:49.798460] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:30.982 [2024-07-24 13:21:49.798534] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168626 ] 00:09:31.241 EAL: No free 2048 kB hugepages reported on node 1 00:09:31.241 [2024-07-24 13:21:50.037615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:31.241 [2024-07-24 13:21:50.064938] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:31.241 [2024-07-24 13:21:50.065108] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.500 [2024-07-24 13:21:50.119638] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:31.500 [2024-07-24 13:21:50.135860] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:09:31.500 INFO: Running with entropic power schedule (0xFF, 100). 00:09:31.500 INFO: Seed: 2162431644 00:09:31.500 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:31.500 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:31.500 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:09:31.500 INFO: A corpus is not provided, starting from an empty corpus 00:09:31.500 #2 INITED exec/s: 0 rss: 61Mb 00:09:31.500 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:31.500 This may also happen if the target rejected all inputs we tried so far 00:09:31.500 [2024-07-24 13:21:50.191314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.500 [2024-07-24 13:21:50.191360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.500 [2024-07-24 13:21:50.191411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.500 [2024-07-24 13:21:50.191436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:31.500 [2024-07-24 13:21:50.191482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.500 [2024-07-24 13:21:50.191506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:31.500 [2024-07-24 13:21:50.191550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.500 [2024-07-24 13:21:50.191574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.066 NEW_FUNC[1/671]: 0x4a47e0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:09:32.066 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:32.066 #12 NEW cov: 11532 ft: 11533 corp: 2/33b lim: 35 exec/s: 0 rss: 68Mb L: 
32/32 MS: 5 ChangeBit-ShuffleBytes-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:09:32.066 [2024-07-24 13:21:50.702910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.066 [2024-07-24 13:21:50.702970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.066 #15 NEW cov: 11645 ft: 13023 corp: 3/42b lim: 35 exec/s: 0 rss: 68Mb L: 9/32 MS: 3 ShuffleBytes-CopyPart-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:09:32.066 [2024-07-24 13:21:50.763435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.066 [2024-07-24 13:21:50.763471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.066 [2024-07-24 13:21:50.763540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.066 [2024-07-24 13:21:50.763561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.067 [2024-07-24 13:21:50.763630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.763649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.067 [2024-07-24 13:21:50.763714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.763734] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.067 #21 NEW cov: 11651 ft: 13279 corp: 4/74b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 ChangeByte- 00:09:32.067 [2024-07-24 13:21:50.823608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.823643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.067 [2024-07-24 13:21:50.823711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.823731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.067 [2024-07-24 13:21:50.823795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.823815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.067 [2024-07-24 13:21:50.823877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.823897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.067 #22 NEW cov: 11736 ft: 13570 corp: 5/106b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:09:32.067 [2024-07-24 13:21:50.873182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 
cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.067 [2024-07-24 13:21:50.873222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.067 #23 NEW cov: 11736 ft: 13665 corp: 6/115b lim: 35 exec/s: 0 rss: 68Mb L: 9/32 MS: 1 CopyPart- 00:09:32.325 [2024-07-24 13:21:50.933867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:50.933902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.325 [2024-07-24 13:21:50.933970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:50.933990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.325 [2024-07-24 13:21:50.934056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:50.934075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.325 [2024-07-24 13:21:50.934138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:50.934162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.325 #24 NEW cov: 11736 ft: 13749 corp: 7/147b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:32.325 [2024-07-24 13:21:50.993570] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:50.993604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.325 #25 NEW cov: 11736 ft: 13815 corp: 8/156b lim: 35 exec/s: 0 rss: 69Mb L: 9/32 MS: 1 ChangeBinInt- 00:09:32.325 [2024-07-24 13:21:51.053859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:51.053896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.325 [2024-07-24 13:21:51.053961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:51.053981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.325 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:32.325 #26 NEW cov: 11753 ft: 14085 corp: 9/172b lim: 35 exec/s: 0 rss: 69Mb L: 16/32 MS: 1 InsertRepeatedBytes- 00:09:32.325 [2024-07-24 13:21:51.103848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:51.103882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.325 #29 NEW cov: 11753 ft: 14099 corp: 10/184b lim: 35 exec/s: 0 rss: 69Mb L: 12/32 MS: 3 ShuffleBytes-CrossOver-CrossOver- 00:09:32.325 [2024-07-24 13:21:51.153967] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.325 [2024-07-24 13:21:51.154002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.325 #34 NEW cov: 11753 ft: 14149 corp: 11/195b lim: 35 exec/s: 34 rss: 69Mb L: 11/32 MS: 5 CrossOver-CrossOver-ChangeByte-CMP-PersAutoDict- DE: "\000\000"-"\001\000\000\000\000\000\000\000"- 00:09:32.584 [2024-07-24 13:21:51.204728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.204763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.204829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00090000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.204848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.204913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.204933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.204997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.205017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:09:32.584 #35 NEW cov: 11753 ft: 14170 corp: 12/227b lim: 35 exec/s: 35 rss: 69Mb L: 32/32 MS: 1 CMP- DE: "\011\000\000\000"- 00:09:32.584 [2024-07-24 13:21:51.264906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.264941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.265010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.265030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.265095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.265116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.265180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.265199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.584 #36 NEW cov: 11753 ft: 14187 corp: 13/257b lim: 35 exec/s: 36 rss: 69Mb L: 30/32 MS: 1 InsertRepeatedBytes- 00:09:32.584 [2024-07-24 13:21:51.314427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.314461] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.584 #37 NEW cov: 11753 ft: 14247 corp: 14/266b lim: 35 exec/s: 37 rss: 69Mb L: 9/32 MS: 1 ChangeBit- 00:09:32.584 [2024-07-24 13:21:51.374761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.584 [2024-07-24 13:21:51.374796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.584 [2024-07-24 13:21:51.374866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.585 [2024-07-24 13:21:51.374886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.585 #38 NEW cov: 11753 ft: 14252 corp: 15/285b lim: 35 exec/s: 38 rss: 69Mb L: 19/32 MS: 1 CrossOver- 00:09:32.585 [2024-07-24 13:21:51.424956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.585 [2024-07-24 13:21:51.424990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.585 [2024-07-24 13:21:51.425059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.585 [2024-07-24 13:21:51.425079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.844 #39 NEW cov: 11753 ft: 14272 corp: 16/304b lim: 35 exec/s: 39 rss: 69Mb L: 19/32 MS: 1 ChangeBit- 00:09:32.844 [2024-07-24 13:21:51.485495] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.485529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.485597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.485617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.485689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.485709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.485771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.485791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.844 #40 NEW cov: 11753 ft: 14313 corp: 17/337b lim: 35 exec/s: 40 rss: 69Mb L: 33/33 MS: 1 CopyPart- 00:09:32.844 [2024-07-24 13:21:51.545714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.545749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.545817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.545837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.545900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.545920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.545983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.546002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.595869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.595903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.595970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.595990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.596057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 
13:21:51.596076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.596140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.596161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.844 #42 NEW cov: 11753 ft: 14344 corp: 18/369b lim: 35 exec/s: 42 rss: 70Mb L: 32/33 MS: 2 ChangeBinInt-CrossOver- 00:09:32.844 [2024-07-24 13:21:51.646008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000db00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.646043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.646117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.646136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.646199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.646224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.646287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.646306] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:32.844 #43 NEW cov: 11753 ft: 14360 corp: 19/401b lim: 35 exec/s: 43 rss: 70Mb L: 32/33 MS: 1 ChangeByte- 00:09:32.844 [2024-07-24 13:21:51.705951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.705986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.706056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00094000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.706076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:32.844 [2024-07-24 13:21:51.706143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.844 [2024-07-24 13:21:51.706162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:33.103 #44 NEW cov: 11753 ft: 14598 corp: 20/424b lim: 35 exec/s: 44 rss: 70Mb L: 23/33 MS: 1 PersAutoDict- DE: "\011\000\000\000"- 00:09:33.103 [2024-07-24 13:21:51.766343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.766378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.766446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.766467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.766531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00020000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.766550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.766614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.766634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:33.103 #45 NEW cov: 11753 ft: 14605 corp: 21/456b lim: 35 exec/s: 45 rss: 70Mb L: 32/33 MS: 1 ChangeBinInt- 00:09:33.103 [2024-07-24 13:21:51.816490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.816525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.816596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.816616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.816680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.816701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.816765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:002a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.816785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:33.103 #46 NEW cov: 11753 ft: 14649 corp: 22/488b lim: 35 exec/s: 46 rss: 70Mb L: 32/33 MS: 1 ChangeByte- 00:09:33.103 [2024-07-24 13:21:51.866048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.866083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.103 #47 NEW cov: 11753 ft: 14665 corp: 23/499b lim: 35 exec/s: 47 rss: 70Mb L: 11/33 MS: 1 EraseBytes- 00:09:33.103 [2024-07-24 13:21:51.916900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.916936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.917001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.917022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.917087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.917107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.917171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.917191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:33.103 [2024-07-24 13:21:51.917256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.917276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:33.103 #48 NEW cov: 11753 ft: 14784 corp: 24/534b lim: 35 exec/s: 48 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:09:33.103 [2024-07-24 13:21:51.966352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.103 [2024-07-24 13:21:51.966388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.363 #49 NEW cov: 11753 ft: 14811 corp: 25/543b lim: 35 exec/s: 49 rss: 70Mb L: 9/35 MS: 1 CopyPart- 00:09:33.363 [2024-07-24 13:21:52.016676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.016711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.363 
[2024-07-24 13:21:52.016780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.016801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:33.363 #50 NEW cov: 11753 ft: 14834 corp: 26/560b lim: 35 exec/s: 50 rss: 70Mb L: 17/35 MS: 1 InsertByte- 00:09:33.363 [2024-07-24 13:21:52.077249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000db00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.077284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.363 [2024-07-24 13:21:52.077352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:20000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.077371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:33.363 [2024-07-24 13:21:52.077439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.077458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:33.363 [2024-07-24 13:21:52.077522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.077542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:33.363 #51 NEW cov: 11760 ft: 14848 corp: 
27/592b lim: 35 exec/s: 51 rss: 70Mb L: 32/35 MS: 1 ChangeBinInt- 00:09:33.363 [2024-07-24 13:21:52.137457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00090000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.137492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:33.363 [2024-07-24 13:21:52.137561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.137582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:33.363 [2024-07-24 13:21:52.137648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.137667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:33.363 [2024-07-24 13:21:52.137731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:33.363 [2024-07-24 13:21:52.137752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:33.363 #52 NEW cov: 11760 ft: 14856 corp: 28/625b lim: 35 exec/s: 26 rss: 70Mb L: 33/35 MS: 1 PersAutoDict- DE: "\011\000\000\000"- 00:09:33.363 #52 DONE cov: 11760 ft: 14856 corp: 28/625b lim: 35 exec/s: 26 rss: 70Mb 00:09:33.363 ###### Recommended dictionary. 
###### 00:09:33.363 "\001\000\000\000\000\000\000\000" # Uses: 2 00:09:33.363 "\000\000" # Uses: 0 00:09:33.363 "\011\000\000\000" # Uses: 2 00:09:33.363 ###### End of recommended dictionary. ###### 00:09:33.363 Done 52 runs in 2 second(s) 00:09:33.622 13:21:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:09:33.622 13:21:52 -- ../common.sh@72 -- # (( i++ )) 00:09:33.622 13:21:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:33.622 13:21:52 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:33.622 13:21:52 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:09:33.622 13:21:52 -- nvmf/run.sh@24 -- # local timen=1 00:09:33.622 13:21:52 -- nvmf/run.sh@25 -- # local core=0x1 00:09:33.622 13:21:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:09:33.622 13:21:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:09:33.622 13:21:52 -- nvmf/run.sh@29 -- # printf %02d 5 00:09:33.622 13:21:52 -- nvmf/run.sh@29 -- # port=4405 00:09:33.622 13:21:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:09:33.622 13:21:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:09:33.622 13:21:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:33.622 13:21:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:09:33.622 [2024-07-24 13:21:52.359515] Starting SPDK v24.01.1-pre git sha1 
dbef7efac / DPDK 23.11.0 initialization... 00:09:33.622 [2024-07-24 13:21:52.359616] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168989 ] 00:09:33.622 EAL: No free 2048 kB hugepages reported on node 1 00:09:33.881 [2024-07-24 13:21:52.602069] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.881 [2024-07-24 13:21:52.628050] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:33.881 [2024-07-24 13:21:52.628230] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.881 [2024-07-24 13:21:52.682772] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:33.881 [2024-07-24 13:21:52.699005] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:09:33.881 INFO: Running with entropic power schedule (0xFF, 100). 00:09:33.881 INFO: Seed: 430468177 00:09:33.881 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:33.881 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:33.881 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:09:33.881 INFO: A corpus is not provided, starting from an empty corpus 00:09:33.881 #2 INITED exec/s: 0 rss: 61Mb 00:09:33.881 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:33.881 This may also happen if the target rejected all inputs we tried so far 00:09:34.139 [2024-07-24 13:21:52.754336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.139 [2024-07-24 13:21:52.754384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.139 [2024-07-24 13:21:52.754435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.139 [2024-07-24 13:21:52.754459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.139 [2024-07-24 13:21:52.754504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.139 [2024-07-24 13:21:52.754527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:34.398 NEW_FUNC[1/671]: 0x4a6970 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:09:34.398 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:34.398 #12 NEW cov: 11543 ft: 11542 corp: 2/33b lim: 45 exec/s: 0 rss: 68Mb L: 32/32 MS: 5 CopyPart-ShuffleBytes-CrossOver-InsertByte-InsertRepeatedBytes- 00:09:34.398 [2024-07-24 13:21:53.115241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.115300] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.398 [2024-07-24 13:21:53.115349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:20757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.115374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.398 [2024-07-24 13:21:53.115418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.115442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:34.398 #13 NEW cov: 11656 ft: 11895 corp: 3/65b lim: 45 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:34.398 [2024-07-24 13:21:53.215441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.215488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.398 [2024-07-24 13:21:53.215537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.215563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.398 [2024-07-24 13:21:53.215606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.215630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:34.398 [2024-07-24 13:21:53.215673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.398 [2024-07-24 13:21:53.215697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:34.703 #19 NEW cov: 11662 ft: 12495 corp: 4/106b lim: 45 exec/s: 0 rss: 68Mb L: 41/41 MS: 1 InsertRepeatedBytes- 00:09:34.703 [2024-07-24 13:21:53.295630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.295678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.295727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.295753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.295796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.295820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.295863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.295887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:09:34.703 #20 NEW cov: 11747 ft: 12790 corp: 5/147b lim: 45 exec/s: 0 rss: 68Mb L: 41/41 MS: 1 ChangeByte- 00:09:34.703 [2024-07-24 13:21:53.395722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.395768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.395816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48487548 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.395840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.703 #26 NEW cov: 11747 ft: 13142 corp: 6/167b lim: 45 exec/s: 0 rss: 68Mb L: 20/41 MS: 1 CrossOver- 00:09:34.703 [2024-07-24 13:21:53.476085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.476129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.476178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.476204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.476261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.703 [2024-07-24 13:21:53.476286] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:34.703 [2024-07-24 13:21:53.476330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.704 [2024-07-24 13:21:53.476353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:34.704 #27 NEW cov: 11747 ft: 13187 corp: 7/208b lim: 45 exec/s: 0 rss: 69Mb L: 41/41 MS: 1 ChangeBit- 00:09:34.704 [2024-07-24 13:21:53.546027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:056e0a3b cdw11:a7910007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.704 [2024-07-24 13:21:53.546073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.963 #28 NEW cov: 11747 ft: 13962 corp: 8/217b lim: 45 exec/s: 0 rss: 69Mb L: 9/41 MS: 1 CMP- DE: ";\005n\247\221\347-\000"- 00:09:34.963 [2024-07-24 13:21:53.626301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:8b8a0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.626345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.963 [2024-07-24 13:21:53.626394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ad488ab7 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.626420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.963 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:34.963 #29 NEW cov: 11764 ft: 14068 corp: 9/237b lim: 45 exec/s: 0 
rss: 69Mb L: 20/41 MS: 1 ChangeBinInt- 00:09:34.963 [2024-07-24 13:21:53.716458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.716502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.963 #30 NEW cov: 11764 ft: 14156 corp: 10/254b lim: 45 exec/s: 30 rss: 69Mb L: 17/41 MS: 1 CrossOver- 00:09:34.963 [2024-07-24 13:21:53.797000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:056e0a3b cdw11:a7910007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.797043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:34.963 [2024-07-24 13:21:53.797093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.797117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:34.963 [2024-07-24 13:21:53.797162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.797185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:34.963 [2024-07-24 13:21:53.797238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:34.963 [2024-07-24 13:21:53.797261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:09:35.221 #31 NEW cov: 11764 ft: 14278 corp: 11/296b lim: 45 exec/s: 31 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:09:35.221 [2024-07-24 13:21:53.887264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:eeee0a75 cdw11:eeee0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:53.887307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.221 [2024-07-24 13:21:53.887356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:eeeeeeee cdw11:eeee0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:53.887380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.221 [2024-07-24 13:21:53.887424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:758bee75 cdw11:8a8a0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:53.887448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.221 [2024-07-24 13:21:53.887490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:4848b7ad cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:53.887513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.221 #32 NEW cov: 11764 ft: 14349 corp: 12/333b lim: 45 exec/s: 32 rss: 69Mb L: 37/42 MS: 1 InsertRepeatedBytes- 00:09:35.221 [2024-07-24 13:21:53.957972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:53.958008] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.221 [2024-07-24 13:21:53.958078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48487548 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:53.958099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.221 #33 NEW cov: 11764 ft: 14470 corp: 13/352b lim: 45 exec/s: 33 rss: 69Mb L: 19/42 MS: 1 EraseBytes- 00:09:35.221 [2024-07-24 13:21:54.008001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:6ea73b05 cdw11:91e70001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:54.008038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.221 #35 NEW cov: 11764 ft: 14481 corp: 14/361b lim: 45 exec/s: 35 rss: 69Mb L: 9/42 MS: 2 ShuffleBytes-PersAutoDict- DE: ";\005n\247\221\347-\000"- 00:09:35.221 [2024-07-24 13:21:54.058327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:8b8a0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:54.058362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.221 [2024-07-24 13:21:54.058432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:75758ab7 cdw11:8b8a0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.221 [2024-07-24 13:21:54.058453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.480 #36 NEW cov: 11764 ft: 14497 corp: 15/381b lim: 45 exec/s: 36 rss: 69Mb L: 20/42 MS: 1 CopyPart- 00:09:35.480 [2024-07-24 
13:21:54.108251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:6ea73b05 cdw11:91e70001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.108285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.480 #40 NEW cov: 11764 ft: 14545 corp: 16/391b lim: 45 exec/s: 40 rss: 69Mb L: 10/42 MS: 4 CopyPart-CrossOver-ShuffleBytes-PersAutoDict- DE: ";\005n\247\221\347-\000"- 00:09:35.480 [2024-07-24 13:21:54.158769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.158804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.158876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48487548 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.158896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.158968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.158987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.480 #41 NEW cov: 11764 ft: 14556 corp: 17/424b lim: 45 exec/s: 41 rss: 69Mb L: 33/42 MS: 1 CopyPart- 00:09:35.480 [2024-07-24 13:21:54.209145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.209180] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.209252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.209273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.209339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.209358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.209427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.209448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.480 #42 NEW cov: 11764 ft: 14595 corp: 18/465b lim: 45 exec/s: 42 rss: 69Mb L: 41/42 MS: 1 CrossOver- 00:09:35.480 [2024-07-24 13:21:54.259241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:056e0a3b cdw11:a7910007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.259275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.259348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.259369] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.259439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.259459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.259526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.259546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.480 #43 NEW cov: 11764 ft: 14621 corp: 19/501b lim: 45 exec/s: 43 rss: 69Mb L: 36/42 MS: 1 EraseBytes- 00:09:35.480 [2024-07-24 13:21:54.319055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.319091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.480 [2024-07-24 13:21:54.319161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48487548 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.480 [2024-07-24 13:21:54.319182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.739 #44 NEW cov: 11764 ft: 14623 corp: 20/520b lim: 45 exec/s: 44 rss: 69Mb L: 19/42 MS: 1 ChangeBit- 00:09:35.739 [2024-07-24 13:21:54.379068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750003 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.379103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.740 #45 NEW cov: 11764 ft: 14652 corp: 21/534b lim: 45 exec/s: 45 rss: 69Mb L: 14/42 MS: 1 CrossOver- 00:09:35.740 [2024-07-24 13:21:54.439766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.439802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.740 [2024-07-24 13:21:54.439872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00290002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.439892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.740 [2024-07-24 13:21:54.439960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.439981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.740 [2024-07-24 13:21:54.440047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.440071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.740 #46 NEW cov: 11764 ft: 14679 corp: 22/575b lim: 45 exec/s: 46 rss: 69Mb L: 41/42 MS: 1 ChangeBinInt- 00:09:35.740 [2024-07-24 13:21:54.499988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.500023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.740 [2024-07-24 13:21:54.500094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48487548 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.500114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.740 [2024-07-24 13:21:54.500181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e8e875e8 cdw11:e8e80007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.500202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.740 [2024-07-24 13:21:54.500275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7575e875 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.500295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.740 #47 NEW cov: 11764 ft: 14697 corp: 23/617b lim: 45 exec/s: 47 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:09:35.740 [2024-07-24 13:21:54.559566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a75 cdw11:75750002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.740 [2024-07-24 13:21:54.559602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.740 #48 NEW cov: 11764 ft: 14710 corp: 24/632b lim: 45 exec/s: 48 rss: 69Mb L: 15/42 MS: 1 EraseBytes- 00:09:35.999 [2024-07-24 
13:21:54.610444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:48484848 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.610481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.610552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:75757575 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.610573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.610642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.610662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.610730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.610751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.610819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.610839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:35.999 #49 NEW cov: 11771 ft: 14785 corp: 25/677b lim: 45 exec/s: 49 rss: 69Mb L: 45/45 MS: 1 CopyPart- 00:09:35.999 [2024-07-24 13:21:54.659802] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75750a0a cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.659842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.999 #50 NEW cov: 11771 ft: 14799 corp: 26/687b lim: 45 exec/s: 50 rss: 69Mb L: 10/45 MS: 1 CrossOver- 00:09:35.999 [2024-07-24 13:21:54.710566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.710602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.710673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.710694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.710762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.710782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:35.999 [2024-07-24 13:21:54.710849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:76757575 cdw11:75750003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:35.999 [2024-07-24 13:21:54.710869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:35.999 #51 NEW cov: 11771 ft: 14813 corp: 27/728b lim: 45 exec/s: 25 rss: 69Mb L: 41/45 
MS: 1 ChangeBinInt- 00:09:35.999 #51 DONE cov: 11771 ft: 14813 corp: 27/728b lim: 45 exec/s: 25 rss: 69Mb 00:09:35.999 ###### Recommended dictionary. ###### 00:09:35.999 ";\005n\247\221\347-\000" # Uses: 2 00:09:35.999 ###### End of recommended dictionary. ###### 00:09:35.999 Done 51 runs in 2 second(s) 00:09:36.258 13:21:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:09:36.258 13:21:54 -- ../common.sh@72 -- # (( i++ )) 00:09:36.258 13:21:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:36.258 13:21:54 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:36.258 13:21:54 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:09:36.258 13:21:54 -- nvmf/run.sh@24 -- # local timen=1 00:09:36.258 13:21:54 -- nvmf/run.sh@25 -- # local core=0x1 00:09:36.258 13:21:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:09:36.258 13:21:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:09:36.258 13:21:54 -- nvmf/run.sh@29 -- # printf %02d 6 00:09:36.258 13:21:54 -- nvmf/run.sh@29 -- # port=4406 00:09:36.258 13:21:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:09:36.258 13:21:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:09:36.258 13:21:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:36.258 13:21:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:09:36.258 
[2024-07-24 13:21:54.933350] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:36.258 [2024-07-24 13:21:54.933425] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3169355 ] 00:09:36.258 EAL: No free 2048 kB hugepages reported on node 1 00:09:36.517 [2024-07-24 13:21:55.189464] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:36.517 [2024-07-24 13:21:55.215602] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:36.517 [2024-07-24 13:21:55.215775] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.517 [2024-07-24 13:21:55.270298] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:36.517 [2024-07-24 13:21:55.286521] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:09:36.517 INFO: Running with entropic power schedule (0xFF, 100). 00:09:36.517 INFO: Seed: 3018467808 00:09:36.517 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:36.517 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:36.517 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:09:36.517 INFO: A corpus is not provided, starting from an empty corpus 00:09:36.517 #2 INITED exec/s: 0 rss: 61Mb 00:09:36.517 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:36.517 This may also happen if the target rejected all inputs we tried so far 00:09:36.517 [2024-07-24 13:21:55.341613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:36.517 [2024-07-24 13:21:55.341660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.035 NEW_FUNC[1/667]: 0x4a9180 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:09:37.035 NEW_FUNC[2/667]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:37.035 #3 NEW cov: 11453 ft: 11461 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:09:37.035 [2024-07-24 13:21:55.702757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.702815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.702863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.702889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.702934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.702959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.703003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 
nsid:0 cdw10:00000a0a cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.703026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:37.035 NEW_FUNC[1/2]: 0x1c80d10 in thread_update_stats /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:918 00:09:37.035 NEW_FUNC[2/2]: 0x1c82bc0 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1310 00:09:37.035 #4 NEW cov: 11573 ft: 12269 corp: 3/11b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:09:37.035 [2024-07-24 13:21:55.802857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.802904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.802951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.802977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.803018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.803048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.803091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.803114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:37.035 #5 NEW cov: 11579 ft: 12515 corp: 4/19b lim: 10 exec/s: 0 
rss: 69Mb L: 8/8 MS: 1 CMP- DE: "\377\000"- 00:09:37.035 [2024-07-24 13:21:55.893142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.893188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.893245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.893272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.893314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.893338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:37.035 [2024-07-24 13:21:55.893380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.035 [2024-07-24 13:21:55.893405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:37.036 [2024-07-24 13:21:55.893447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:37.036 [2024-07-24 13:21:55.893472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:37.295 #6 NEW cov: 11664 ft: 12785 corp: 5/29b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:09:37.295 [2024-07-24 13:21:55.993163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a2a cdw11:00000000 00:09:37.295 
[2024-07-24 13:21:55.993208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.295 #8 NEW cov: 11664 ft: 12865 corp: 6/31b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 2 ChangeBit-CopyPart- 00:09:37.295 [2024-07-24 13:21:56.063371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 00:09:37.295 [2024-07-24 13:21:56.063416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.295 #10 NEW cov: 11664 ft: 13014 corp: 7/34b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 2 ShuffleBytes-PersAutoDict- DE: "\377\000"- 00:09:37.295 [2024-07-24 13:21:56.133583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000290a cdw11:00000000 00:09:37.295 [2024-07-24 13:21:56.133627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.554 #11 NEW cov: 11664 ft: 13087 corp: 8/36b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 InsertByte- 00:09:37.554 [2024-07-24 13:21:56.203783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.554 [2024-07-24 13:21:56.203827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.554 [2024-07-24 13:21:56.203874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:09:37.554 [2024-07-24 13:21:56.203899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.554 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:37.554 #12 NEW cov: 
11681 ft: 13295 corp: 9/41b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:09:37.554 [2024-07-24 13:21:56.273905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff80 cdw11:00000000 00:09:37.554 [2024-07-24 13:21:56.273949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.554 #13 NEW cov: 11681 ft: 13313 corp: 10/44b lim: 10 exec/s: 13 rss: 69Mb L: 3/10 MS: 1 ChangeBit- 00:09:37.554 [2024-07-24 13:21:56.364278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.554 [2024-07-24 13:21:56.364324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.554 [2024-07-24 13:21:56.364371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:09:37.554 [2024-07-24 13:21:56.364396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.812 #14 NEW cov: 11681 ft: 13373 corp: 11/48b lim: 10 exec/s: 14 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:09:37.812 [2024-07-24 13:21:56.454426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff0a cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.454470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.813 #15 NEW cov: 11681 ft: 13385 corp: 12/50b lim: 10 exec/s: 15 rss: 69Mb L: 2/10 MS: 1 CrossOver- 00:09:37.813 [2024-07-24 13:21:56.534613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.534656] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.813 #16 NEW cov: 11681 ft: 13414 corp: 13/52b lim: 10 exec/s: 16 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:09:37.813 [2024-07-24 13:21:56.605055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.605098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.813 [2024-07-24 13:21:56.605145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.605170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.813 [2024-07-24 13:21:56.605223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003e00 cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.605247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:37.813 [2024-07-24 13:21:56.605289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.605312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:37.813 #17 NEW cov: 11681 ft: 13447 corp: 14/60b lim: 10 exec/s: 17 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:09:37.813 [2024-07-24 13:21:56.674993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a2a cdw11:00000000 00:09:37.813 [2024-07-24 13:21:56.675036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.071 #18 NEW cov: 11681 ft: 
13484 corp: 15/63b lim: 10 exec/s: 18 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:09:38.071 [2024-07-24 13:21:56.765364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:38.071 [2024-07-24 13:21:56.765407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.072 [2024-07-24 13:21:56.765461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:38.072 [2024-07-24 13:21:56.765485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.072 #19 NEW cov: 11681 ft: 13544 corp: 16/67b lim: 10 exec/s: 19 rss: 69Mb L: 4/10 MS: 1 EraseBytes- 00:09:38.072 [2024-07-24 13:21:56.855748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004646 cdw11:00000000 00:09:38.072 [2024-07-24 13:21:56.855791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.072 [2024-07-24 13:21:56.855839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004646 cdw11:00000000 00:09:38.072 [2024-07-24 13:21:56.855864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.072 [2024-07-24 13:21:56.855906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004646 cdw11:00000000 00:09:38.072 [2024-07-24 13:21:56.855931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:38.072 [2024-07-24 13:21:56.855973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 
cdw10:000046ff cdw11:00000000 00:09:38.072 [2024-07-24 13:21:56.855996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:38.072 #20 NEW cov: 11681 ft: 13561 corp: 17/76b lim: 10 exec/s: 20 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:09:38.330 [2024-07-24 13:21:56.945769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 00:09:38.330 [2024-07-24 13:21:56.945813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.330 #21 NEW cov: 11681 ft: 13622 corp: 18/78b lim: 10 exec/s: 21 rss: 69Mb L: 2/10 MS: 1 EraseBytes- 00:09:38.330 [2024-07-24 13:21:57.016260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e46 cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.016304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.330 [2024-07-24 13:21:57.016351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004646 cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.016376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.330 [2024-07-24 13:21:57.016419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004646 cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.016444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:38.330 [2024-07-24 13:21:57.016486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00004646 cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.016510] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:38.330 [2024-07-24 13:21:57.016553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.016577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:38.330 #22 NEW cov: 11681 ft: 13630 corp: 19/88b lim: 10 exec/s: 22 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:09:38.330 [2024-07-24 13:21:57.106304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.106347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.330 [2024-07-24 13:21:57.106400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.106426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.330 #23 NEW cov: 11681 ft: 13657 corp: 20/93b lim: 10 exec/s: 23 rss: 69Mb L: 5/10 MS: 1 ChangeByte- 00:09:38.330 [2024-07-24 13:21:57.176411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff27 cdw11:00000000 00:09:38.330 [2024-07-24 13:21:57.176457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.590 #24 NEW cov: 11688 ft: 13686 corp: 21/95b lim: 10 exec/s: 24 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:09:38.590 [2024-07-24 13:21:57.266872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:38.590 [2024-07-24 13:21:57.266917] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.590 [2024-07-24 13:21:57.266965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:38.590 [2024-07-24 13:21:57.266990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.590 [2024-07-24 13:21:57.267033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003e0a cdw11:00000000 00:09:38.590 [2024-07-24 13:21:57.267057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:38.590 [2024-07-24 13:21:57.267099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:09:38.590 [2024-07-24 13:21:57.267123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:38.590 #25 NEW cov: 11688 ft: 13771 corp: 22/103b lim: 10 exec/s: 25 rss: 70Mb L: 8/10 MS: 1 ShuffleBytes- 00:09:38.590 [2024-07-24 13:21:57.336820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff0a cdw11:00000000 00:09:38.590 [2024-07-24 13:21:57.336865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.590 #26 NEW cov: 11688 ft: 13796 corp: 23/106b lim: 10 exec/s: 13 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:09:38.590 #26 DONE cov: 11688 ft: 13796 corp: 23/106b lim: 10 exec/s: 13 rss: 70Mb 00:09:38.590 ###### Recommended dictionary. ###### 00:09:38.590 "\377\000" # Uses: 1 00:09:38.590 ###### End of recommended dictionary. 
###### 00:09:38.590 Done 26 runs in 2 second(s) 00:09:38.849 13:21:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:09:38.849 13:21:57 -- ../common.sh@72 -- # (( i++ )) 00:09:38.849 13:21:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:38.849 13:21:57 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:09:38.849 13:21:57 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:09:38.849 13:21:57 -- nvmf/run.sh@24 -- # local timen=1 00:09:38.849 13:21:57 -- nvmf/run.sh@25 -- # local core=0x1 00:09:38.849 13:21:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:09:38.849 13:21:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:09:38.849 13:21:57 -- nvmf/run.sh@29 -- # printf %02d 7 00:09:38.849 13:21:57 -- nvmf/run.sh@29 -- # port=4407 00:09:38.849 13:21:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:09:38.849 13:21:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:09:38.849 13:21:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:38.849 13:21:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:09:38.849 [2024-07-24 13:21:57.595061] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:38.849 [2024-07-24 13:21:57.595157] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3169720 ] 00:09:38.849 EAL: No free 2048 kB hugepages reported on node 1 00:09:39.108 [2024-07-24 13:21:57.853247] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:39.108 [2024-07-24 13:21:57.879177] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:39.108 [2024-07-24 13:21:57.879361] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:39.108 [2024-07-24 13:21:57.933876] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:39.108 [2024-07-24 13:21:57.950109] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:09:39.108 INFO: Running with entropic power schedule (0xFF, 100). 00:09:39.108 INFO: Seed: 1387506559 00:09:39.368 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:39.368 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:39.368 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:09:39.368 INFO: A corpus is not provided, starting from an empty corpus 00:09:39.368 #2 INITED exec/s: 0 rss: 61Mb 00:09:39.368 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:39.368 This may also happen if the target rejected all inputs we tried so far 00:09:39.368 [2024-07-24 13:21:58.006014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007474 cdw11:00000000 00:09:39.368 [2024-07-24 13:21:58.006053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.368 [2024-07-24 13:21:58.006118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007474 cdw11:00000000 00:09:39.368 [2024-07-24 13:21:58.006138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.368 [2024-07-24 13:21:58.006205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007474 cdw11:00000000 00:09:39.368 [2024-07-24 13:21:58.006231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.627 NEW_FUNC[1/669]: 0x4a9b70 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:09:39.627 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:39.627 #3 NEW cov: 11460 ft: 11461 corp: 2/8b lim: 10 exec/s: 0 rss: 68Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:09:39.627 [2024-07-24 13:21:58.476846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7e cdw11:00000000 00:09:39.627 [2024-07-24 13:21:58.476895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.886 #5 NEW cov: 11573 ft: 12037 corp: 3/10b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 2 ChangeBit-InsertByte- 
00:09:39.886 [2024-07-24 13:21:58.526872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.526910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.886 #10 NEW cov: 11579 ft: 12157 corp: 4/12b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 5 ChangeBit-ChangeBinInt-ChangeBit-ChangeByte-InsertByte- 00:09:39.886 [2024-07-24 13:21:58.567040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a90a cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.567080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.886 #11 NEW cov: 11664 ft: 12541 corp: 5/14b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 1 CrossOver- 00:09:39.886 [2024-07-24 13:21:58.627455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.627491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.886 [2024-07-24 13:21:58.627558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.627579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.886 [2024-07-24 13:21:58.627643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.627663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.886 #12 NEW cov: 11664 ft: 12707 corp: 6/20b lim: 10 exec/s: 0 rss: 
68Mb L: 6/7 MS: 1 InsertRepeatedBytes- 00:09:39.886 [2024-07-24 13:21:58.677359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a90a cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.677395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.886 #13 NEW cov: 11664 ft: 12840 corp: 7/22b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 1 ShuffleBytes- 00:09:39.886 [2024-07-24 13:21:58.737791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a90a cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.737827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.886 [2024-07-24 13:21:58.737894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.737914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.886 [2024-07-24 13:21:58.737979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.886 [2024-07-24 13:21:58.738000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.145 #14 NEW cov: 11664 ft: 12915 corp: 8/28b lim: 10 exec/s: 0 rss: 69Mb L: 6/7 MS: 1 InsertRepeatedBytes- 00:09:40.145 [2024-07-24 13:21:58.798005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007474 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.798040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:58.798105] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006074 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.798125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:58.798191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007474 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.798209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.145 #15 NEW cov: 11664 ft: 12955 corp: 9/35b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 ChangeByte- 00:09:40.145 [2024-07-24 13:21:58.857830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005ba9 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.857866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.145 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:40.145 #16 NEW cov: 11687 ft: 13004 corp: 10/38b lim: 10 exec/s: 0 rss: 69Mb L: 3/7 MS: 1 InsertByte- 00:09:40.145 [2024-07-24 13:21:58.907946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000250a cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.907981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.145 #17 NEW cov: 11687 ft: 13050 corp: 11/40b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 InsertByte- 00:09:40.145 [2024-07-24 13:21:58.948691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.948726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:58.948791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.948811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:58.948875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002e2e cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.948895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:58.948959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002e2e cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.948979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:58.949042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:58.949062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:40.145 #18 NEW cov: 11687 ft: 13321 corp: 12/50b lim: 10 exec/s: 18 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:09:40.145 [2024-07-24 13:21:59.008616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007474 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:59.008652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:59.008719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00007474 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:59.008739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.145 [2024-07-24 13:21:59.008806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007474 cdw11:00000000 00:09:40.145 [2024-07-24 13:21:59.008826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.404 #19 NEW cov: 11687 ft: 13344 corp: 13/57b lim: 10 exec/s: 19 rss: 69Mb L: 7/10 MS: 1 CopyPart- 00:09:40.404 [2024-07-24 13:21:59.058446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a90a cdw11:00000000 00:09:40.404 [2024-07-24 13:21:59.058481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.405 #20 NEW cov: 11687 ft: 13369 corp: 14/59b lim: 10 exec/s: 20 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:09:40.405 [2024-07-24 13:21:59.109144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.109180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.405 [2024-07-24 13:21:59.109251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.109276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.405 [2024-07-24 13:21:59.109339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.109359] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.405 [2024-07-24 13:21:59.109420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.109440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.405 [2024-07-24 13:21:59.109501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.109522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:40.405 #21 NEW cov: 11687 ft: 13412 corp: 15/69b lim: 10 exec/s: 21 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:09:40.405 [2024-07-24 13:21:59.158764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.158799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.405 #22 NEW cov: 11687 ft: 13455 corp: 16/72b lim: 10 exec/s: 22 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:09:40.405 [2024-07-24 13:21:59.198865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.198899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.405 #23 NEW cov: 11687 ft: 13469 corp: 17/74b lim: 10 exec/s: 23 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:09:40.405 [2024-07-24 13:21:59.259053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000680e cdw11:00000000 00:09:40.405 [2024-07-24 13:21:59.259087] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.664 #24 NEW cov: 11687 ft: 13494 corp: 18/77b lim: 10 exec/s: 24 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:09:40.664 [2024-07-24 13:21:59.319610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.319644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.319710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.319730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.319795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.319813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.319877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.319897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.664 #25 NEW cov: 11687 ft: 13556 corp: 19/85b lim: 10 exec/s: 25 rss: 69Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:09:40.664 [2024-07-24 13:21:59.369355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a90a cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.369389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:09:40.664 #26 NEW cov: 11687 ft: 13563 corp: 20/87b lim: 10 exec/s: 26 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:09:40.664 [2024-07-24 13:21:59.410084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.410119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.410184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.410204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.410272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002e2e cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.410291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.410354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002e2e cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.410374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.410435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000a95d cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.410455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:40.664 #27 NEW cov: 11687 ft: 13576 corp: 21/97b lim: 10 exec/s: 27 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:09:40.664 [2024-07-24 13:21:59.470225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a9ee cdw11:00000000 00:09:40.664 [2024-07-24 13:21:59.470260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.664 [2024-07-24 13:21:59.470325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.665 [2024-07-24 13:21:59.470344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.665 [2024-07-24 13:21:59.470406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.665 [2024-07-24 13:21:59.470425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.665 [2024-07-24 13:21:59.470489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.665 [2024-07-24 13:21:59.470509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.665 [2024-07-24 13:21:59.470574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.665 [2024-07-24 13:21:59.470595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:40.665 #33 NEW cov: 11687 ft: 13587 corp: 22/107b lim: 10 exec/s: 33 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:09:40.924 [2024-07-24 13:21:59.530269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.530304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:09:40.924 [2024-07-24 13:21:59.530368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000250a cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.530388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.924 [2024-07-24 13:21:59.530449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.530472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.924 [2024-07-24 13:21:59.530535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.530554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.924 #34 NEW cov: 11687 ft: 13599 corp: 23/115b lim: 10 exec/s: 34 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:09:40.924 [2024-07-24 13:21:59.580149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.580183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.924 [2024-07-24 13:21:59.580254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002ea9 cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.580275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.924 #35 NEW cov: 11687 ft: 13833 corp: 24/120b lim: 10 exec/s: 35 rss: 70Mb L: 5/10 MS: 1 EraseBytes- 00:09:40.924 [2024-07-24 13:21:59.640177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:09:40.924 [2024-07-24 13:21:59.640218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.925 #36 NEW cov: 11687 ft: 13847 corp: 25/122b lim: 10 exec/s: 36 rss: 70Mb L: 2/10 MS: 1 ChangeBinInt- 00:09:40.925 [2024-07-24 13:21:59.690628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a3a3 cdw11:00000000 00:09:40.925 [2024-07-24 13:21:59.690662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.925 [2024-07-24 13:21:59.690729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a368 cdw11:00000000 00:09:40.925 [2024-07-24 13:21:59.690749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.925 [2024-07-24 13:21:59.690814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000e0a cdw11:00000000 00:09:40.925 [2024-07-24 13:21:59.690833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.925 #37 NEW cov: 11687 ft: 13887 corp: 26/128b lim: 10 exec/s: 37 rss: 70Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:09:40.925 [2024-07-24 13:21:59.750917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a7e cdw11:00000000 00:09:40.925 [2024-07-24 13:21:59.750951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.925 [2024-07-24 13:21:59.751017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:40.925 [2024-07-24 
13:21:59.751036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.925 [2024-07-24 13:21:59.751100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:40.925 [2024-07-24 13:21:59.751119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.925 [2024-07-24 13:21:59.751182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:40.925 [2024-07-24 13:21:59.751203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:41.184 #38 NEW cov: 11687 ft: 13899 corp: 27/136b lim: 10 exec/s: 38 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:09:41.184 [2024-07-24 13:21:59.810616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aa9 cdw11:00000000 00:09:41.184 [2024-07-24 13:21:59.810651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.184 #41 NEW cov: 11687 ft: 13913 corp: 28/138b lim: 10 exec/s: 41 rss: 70Mb L: 2/10 MS: 3 EraseBytes-ShuffleBytes-CrossOver- 00:09:41.184 [2024-07-24 13:21:59.851065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000f00 cdw11:00000000 00:09:41.184 [2024-07-24 13:21:59.851101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.184 [2024-07-24 13:21:59.851164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:41.184 [2024-07-24 13:21:59.851184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.184 [2024-07-24 13:21:59.851253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a913 cdw11:00000000 00:09:41.184 [2024-07-24 13:21:59.851274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:41.184 #42 NEW cov: 11687 ft: 14004 corp: 29/144b lim: 10 exec/s: 42 rss: 70Mb L: 6/10 MS: 1 CMP- DE: "\017\000\000\000"- 00:09:41.184 [2024-07-24 13:21:59.900942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a90a cdw11:00000000 00:09:41.184 [2024-07-24 13:21:59.900978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.184 #43 NEW cov: 11687 ft: 14034 corp: 30/147b lim: 10 exec/s: 43 rss: 70Mb L: 3/10 MS: 1 InsertByte- 00:09:41.184 [2024-07-24 13:21:59.961089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a982 cdw11:00000000 00:09:41.184 [2024-07-24 13:21:59.961123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.184 #44 NEW cov: 11687 ft: 14044 corp: 31/150b lim: 10 exec/s: 22 rss: 70Mb L: 3/10 MS: 1 InsertByte- 00:09:41.184 #44 DONE cov: 11687 ft: 14044 corp: 31/150b lim: 10 exec/s: 22 rss: 70Mb 00:09:41.184 ###### Recommended dictionary. ###### 00:09:41.184 "\017\000\000\000" # Uses: 0 00:09:41.184 ###### End of recommended dictionary. 
###### 00:09:41.184 Done 44 runs in 2 second(s) 00:09:41.443 13:22:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:09:41.443 13:22:00 -- ../common.sh@72 -- # (( i++ )) 00:09:41.443 13:22:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:41.443 13:22:00 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:09:41.443 13:22:00 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:09:41.443 13:22:00 -- nvmf/run.sh@24 -- # local timen=1 00:09:41.443 13:22:00 -- nvmf/run.sh@25 -- # local core=0x1 00:09:41.443 13:22:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:09:41.443 13:22:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:09:41.443 13:22:00 -- nvmf/run.sh@29 -- # printf %02d 8 00:09:41.443 13:22:00 -- nvmf/run.sh@29 -- # port=4408 00:09:41.443 13:22:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:09:41.443 13:22:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:09:41.443 13:22:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:41.443 13:22:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:09:41.443 [2024-07-24 13:22:00.171893] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:41.443 [2024-07-24 13:22:00.171968] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170060 ] 00:09:41.443 EAL: No free 2048 kB hugepages reported on node 1 00:09:41.702 [2024-07-24 13:22:00.426320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:41.702 [2024-07-24 13:22:00.452474] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:41.702 [2024-07-24 13:22:00.452653] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.702 [2024-07-24 13:22:00.507197] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:41.702 [2024-07-24 13:22:00.523436] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:09:41.702 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:41.702 INFO: Seed: 3960501243 00:09:41.702 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:41.702 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:41.702 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:09:41.702 INFO: A corpus is not provided, starting from an empty corpus 00:09:41.961 [2024-07-24 13:22:00.579042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.961 [2024-07-24 13:22:00.579081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.961 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 67Mb 00:09:41.961 [2024-07-24 13:22:00.619013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.961 [2024-07-24 13:22:00.619049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.961 #3 NEW cov: 11601 ft: 12016 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeBinInt- 00:09:41.961 [2024-07-24 13:22:00.679183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.961 [2024-07-24 13:22:00.679224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.961 #4 NEW cov: 11607 ft: 12143 corp: 3/3b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:09:41.961 [2024-07-24 13:22:00.729520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.961 [2024-07-24 13:22:00.729555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.962 [2024-07-24 13:22:00.729626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.962 [2024-07-24 13:22:00.729647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.962 #5 NEW cov: 11692 ft: 13096 corp: 4/5b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:09:41.962 [2024-07-24 13:22:00.789690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.962 [2024-07-24 13:22:00.789724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.962 [2024-07-24 13:22:00.789798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.962 [2024-07-24 13:22:00.789818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.962 #6 NEW cov: 11692 ft: 13127 corp: 5/7b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:09:42.221 [2024-07-24 13:22:00.839788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:00.839822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.221 [2024-07-24 13:22:00.839891] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:00.839911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.221 #7 NEW cov: 11692 ft: 13316 corp: 6/9b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:09:42.221 [2024-07-24 13:22:00.889974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:00.890009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.221 [2024-07-24 13:22:00.890080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:00.890100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.221 #8 NEW cov: 11692 ft: 13450 corp: 7/11b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:09:42.221 [2024-07-24 13:22:00.950190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:00.950231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.221 [2024-07-24 13:22:00.950303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:00.950324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.221 #9 NEW cov: 11692 ft: 13500 corp: 8/13b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:09:42.221 [2024-07-24 13:22:01.000491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:01.000526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.221 [2024-07-24 13:22:01.000600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:01.000621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.221 [2024-07-24 13:22:01.000693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:01.000713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.221 #10 NEW cov: 11692 ft: 13714 corp: 9/16b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CopyPart- 00:09:42.221 [2024-07-24 13:22:01.060486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 [2024-07-24 13:22:01.060521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.221 [2024-07-24 13:22:01.060598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.221 
[2024-07-24 13:22:01.060620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.481 #11 NEW cov: 11692 ft: 13753 corp: 10/18b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 CrossOver- 00:09:42.481 [2024-07-24 13:22:01.120608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.120642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.120713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.120733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.481 #12 NEW cov: 11692 ft: 13787 corp: 11/20b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 CrossOver- 00:09:42.481 [2024-07-24 13:22:01.171124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.171160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.171229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.171251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.171317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.171338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.171405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.171426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.481 #13 NEW cov: 11692 ft: 14087 corp: 12/24b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 CopyPart- 00:09:42.481 [2024-07-24 13:22:01.231045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.231083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.231161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.231182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.231257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.231279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.481 #14 NEW cov: 11692 ft: 14093 corp: 13/27b lim: 5 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:09:42.481 
[2024-07-24 13:22:01.290865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.290907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.481 #15 NEW cov: 11692 ft: 14191 corp: 14/28b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 CopyPart- 00:09:42.481 [2024-07-24 13:22:01.341646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.341681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.341755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.341775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.341844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.341864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.481 [2024-07-24 13:22:01.341931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.481 [2024-07-24 13:22:01.341952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.740 #16 
NEW cov: 11692 ft: 14244 corp: 15/32b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 ChangeBit- 00:09:42.740 [2024-07-24 13:22:01.401643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.740 [2024-07-24 13:22:01.401680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.740 [2024-07-24 13:22:01.401754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.740 [2024-07-24 13:22:01.401776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.740 [2024-07-24 13:22:01.401845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.740 [2024-07-24 13:22:01.401864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.740 #17 NEW cov: 11692 ft: 14258 corp: 16/35b lim: 5 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:09:42.740 [2024-07-24 13:22:01.451545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.740 [2024-07-24 13:22:01.451582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.740 [2024-07-24 13:22:01.451653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.740 [2024-07-24 13:22:01.451674] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.998 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:42.998 #18 NEW cov: 11715 ft: 14339 corp: 17/37b lim: 5 exec/s: 18 rss: 69Mb L: 2/4 MS: 1 ChangeByte- 00:09:42.998 [2024-07-24 13:22:01.762858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.998 [2024-07-24 13:22:01.762914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.998 [2024-07-24 13:22:01.762994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.999 [2024-07-24 13:22:01.763019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.999 [2024-07-24 13:22:01.763091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.999 [2024-07-24 13:22:01.763114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.999 [2024-07-24 13:22:01.763186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.999 [2024-07-24 13:22:01.763209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.999 #19 NEW cov: 11715 ft: 14361 corp: 18/41b lim: 5 exec/s: 19 rss: 70Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:42.999 [2024-07-24 
13:22:01.822670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.999 [2024-07-24 13:22:01.822706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.999 [2024-07-24 13:22:01.822776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.999 [2024-07-24 13:22:01.822796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.999 [2024-07-24 13:22:01.822865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.999 [2024-07-24 13:22:01.822884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.259 #20 NEW cov: 11715 ft: 14375 corp: 19/44b lim: 5 exec/s: 20 rss: 70Mb L: 3/4 MS: 1 ChangeByte- 00:09:43.259 [2024-07-24 13:22:01.882505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:01.882539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.259 #21 NEW cov: 11715 ft: 14396 corp: 20/45b lim: 5 exec/s: 21 rss: 70Mb L: 1/4 MS: 1 CrossOver- 00:09:43.259 [2024-07-24 13:22:01.922792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:01.922826] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.259 [2024-07-24 13:22:01.922895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:01.922916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.259 #22 NEW cov: 11715 ft: 14404 corp: 21/47b lim: 5 exec/s: 22 rss: 70Mb L: 2/4 MS: 1 CopyPart- 00:09:43.259 [2024-07-24 13:22:01.983104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:01.983138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.259 [2024-07-24 13:22:01.983217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:01.983241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.259 [2024-07-24 13:22:01.983310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:01.983329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.259 #23 NEW cov: 11715 ft: 14440 corp: 22/50b lim: 5 exec/s: 23 rss: 70Mb L: 3/4 MS: 1 ShuffleBytes- 00:09:43.259 [2024-07-24 13:22:02.033313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:02.033347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.259 [2024-07-24 13:22:02.033416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:02.033437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.259 [2024-07-24 13:22:02.033503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:02.033524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.259 #24 NEW cov: 11715 ft: 14516 corp: 23/53b lim: 5 exec/s: 24 rss: 70Mb L: 3/4 MS: 1 CrossOver- 00:09:43.259 [2024-07-24 13:22:02.093270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:02.093304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.259 [2024-07-24 13:22:02.093373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.259 [2024-07-24 13:22:02.093393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.518 #25 NEW cov: 11715 ft: 14538 corp: 24/55b lim: 5 exec/s: 25 rss: 70Mb L: 2/4 MS: 1 ShuffleBytes- 00:09:43.518 [2024-07-24 13:22:02.153456] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.153489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.518 [2024-07-24 13:22:02.153556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.153576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.518 #26 NEW cov: 11715 ft: 14574 corp: 25/57b lim: 5 exec/s: 26 rss: 70Mb L: 2/4 MS: 1 InsertByte- 00:09:43.518 [2024-07-24 13:22:02.204069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.204106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.518 [2024-07-24 13:22:02.204174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.204197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.518 [2024-07-24 13:22:02.204275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.204294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.518 [2024-07-24 13:22:02.204362] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.204382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:43.518 [2024-07-24 13:22:02.204447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.204467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:43.518 #27 NEW cov: 11715 ft: 14634 corp: 26/62b lim: 5 exec/s: 27 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:09:43.518 [2024-07-24 13:22:02.253748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.253783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.518 [2024-07-24 13:22:02.253850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.518 [2024-07-24 13:22:02.253870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.518 #28 NEW cov: 11715 ft: 14655 corp: 27/64b lim: 5 exec/s: 28 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:09:43.518 [2024-07-24 13:22:02.303662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.519 [2024-07-24 13:22:02.303697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.519 #29 NEW cov: 11715 ft: 14683 corp: 28/65b lim: 5 exec/s: 29 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:09:43.519 [2024-07-24 13:22:02.354197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.519 [2024-07-24 13:22:02.354238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.519 [2024-07-24 13:22:02.354307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.519 [2024-07-24 13:22:02.354327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.519 [2024-07-24 13:22:02.354395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.519 [2024-07-24 13:22:02.354415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.778 #30 NEW cov: 11715 ft: 14688 corp: 29/68b lim: 5 exec/s: 30 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:09:43.778 [2024-07-24 13:22:02.404512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.778 [2024-07-24 13:22:02.404545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.778 [2024-07-24 13:22:02.404613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:43.778 [2024-07-24 13:22:02.404637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.778 [2024-07-24 13:22:02.404702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.778 [2024-07-24 13:22:02.404721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.778 [2024-07-24 13:22:02.404786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.778 [2024-07-24 13:22:02.404805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:43.778 #31 NEW cov: 11715 ft: 14716 corp: 30/72b lim: 5 exec/s: 31 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:09:43.779 [2024-07-24 13:22:02.454120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.454155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.779 #32 NEW cov: 11715 ft: 14729 corp: 31/73b lim: 5 exec/s: 32 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:09:43.779 [2024-07-24 13:22:02.515070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.515106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.779 [2024-07-24 13:22:02.515179] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.515200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.779 [2024-07-24 13:22:02.515272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.515292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.779 [2024-07-24 13:22:02.515361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.515381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:43.779 [2024-07-24 13:22:02.515450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.515471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:43.779 #33 NEW cov: 11715 ft: 14735 corp: 32/78b lim: 5 exec/s: 33 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:43.779 [2024-07-24 13:22:02.564842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.564876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.779 [2024-07-24 13:22:02.564945] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.564965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.779 [2024-07-24 13:22:02.565029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.779 [2024-07-24 13:22:02.565054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.779 #34 NEW cov: 11715 ft: 14760 corp: 33/81b lim: 5 exec/s: 17 rss: 70Mb L: 3/5 MS: 1 CMP- DE: "\001\025"- 00:09:43.779 #34 DONE cov: 11715 ft: 14760 corp: 33/81b lim: 5 exec/s: 17 rss: 70Mb 00:09:43.779 ###### Recommended dictionary. ###### 00:09:43.779 "\001\025" # Uses: 0 00:09:43.779 ###### End of recommended dictionary. 
###### 00:09:43.779 Done 34 runs in 2 second(s) 00:09:44.038 13:22:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:09:44.038 13:22:02 -- ../common.sh@72 -- # (( i++ )) 00:09:44.038 13:22:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:44.038 13:22:02 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:09:44.038 13:22:02 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:09:44.038 13:22:02 -- nvmf/run.sh@24 -- # local timen=1 00:09:44.038 13:22:02 -- nvmf/run.sh@25 -- # local core=0x1 00:09:44.038 13:22:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:44.038 13:22:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:09:44.038 13:22:02 -- nvmf/run.sh@29 -- # printf %02d 9 00:09:44.038 13:22:02 -- nvmf/run.sh@29 -- # port=4409 00:09:44.038 13:22:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:44.038 13:22:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:09:44.038 13:22:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:44.038 13:22:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:09:44.038 [2024-07-24 13:22:02.788185] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:44.038 [2024-07-24 13:22:02.788281] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170374 ] 00:09:44.038 EAL: No free 2048 kB hugepages reported on node 1 00:09:44.297 [2024-07-24 13:22:03.050226] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:44.297 [2024-07-24 13:22:03.076536] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:44.297 [2024-07-24 13:22:03.076714] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:44.297 [2024-07-24 13:22:03.131939] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:44.297 [2024-07-24 13:22:03.148181] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:09:44.556 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:44.556 INFO: Seed: 2289542332 00:09:44.556 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:44.556 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:44.556 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:44.556 INFO: A corpus is not provided, starting from an empty corpus 00:09:44.556 [2024-07-24 13:22:03.213855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.556 [2024-07-24 13:22:03.213894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.556 #2 INITED cov: 11440 ft: 11489 corp: 1/1b exec/s: 0 rss: 67Mb 00:09:44.556 [2024-07-24 13:22:03.253785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.556 [2024-07-24 13:22:03.253825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.123 NEW_FUNC[1/2]: 0x1975690 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:09:45.123 NEW_FUNC[2/2]: 0x197aa30 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:09:45.123 #3 NEW cov: 11601 ft: 11991 corp: 2/2b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 CrossOver- 00:09:45.123 [2024-07-24 13:22:03.715433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.123 [2024-07-24 13:22:03.715494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.123 
[2024-07-24 13:22:03.715580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.123 [2024-07-24 13:22:03.715608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.123 #4 NEW cov: 11607 ft: 12963 corp: 3/4b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 CopyPart- 00:09:45.123 [2024-07-24 13:22:03.775196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.123 [2024-07-24 13:22:03.775237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.124 #5 NEW cov: 11692 ft: 13221 corp: 4/5b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeByte- 00:09:45.124 [2024-07-24 13:22:03.825496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.825531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.124 [2024-07-24 13:22:03.825599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.825619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.124 #6 NEW cov: 11692 ft: 13311 corp: 5/7b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ChangeByte- 00:09:45.124 [2024-07-24 13:22:03.885498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.885533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.124 #7 NEW cov: 11692 ft: 13385 corp: 6/8b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ShuffleBytes- 00:09:45.124 [2024-07-24 13:22:03.925760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.925795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.124 [2024-07-24 13:22:03.925864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.925884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.124 #8 NEW cov: 11692 ft: 13471 corp: 7/10b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 ShuffleBytes- 00:09:45.124 [2024-07-24 13:22:03.986192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.986233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.124 [2024-07-24 13:22:03.986312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.986335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.124 [2024-07-24 13:22:03.986404] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.124 [2024-07-24 13:22:03.986425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.383 #9 NEW cov: 11692 ft: 13704 corp: 8/13b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:09:45.383 [2024-07-24 13:22:04.045961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.383 [2024-07-24 13:22:04.045996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.383 #10 NEW cov: 11692 ft: 13751 corp: 9/14b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 ShuffleBytes- 00:09:45.383 [2024-07-24 13:22:04.096186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.383 [2024-07-24 13:22:04.096228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.383 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:45.383 #11 NEW cov: 11715 ft: 13869 corp: 10/15b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 EraseBytes- 00:09:45.383 [2024-07-24 13:22:04.146436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.383 [2024-07-24 13:22:04.146472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.383 [2024-07-24 13:22:04.146542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.383 [2024-07-24 13:22:04.146562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.383 #12 NEW cov: 11715 ft: 13882 corp: 11/17b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 EraseBytes- 00:09:45.383 [2024-07-24 13:22:04.206471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.383 [2024-07-24 13:22:04.206506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.383 #13 NEW cov: 11715 ft: 13912 corp: 12/18b lim: 5 exec/s: 13 rss: 69Mb L: 1/3 MS: 1 ShuffleBytes- 00:09:45.642 [2024-07-24 13:22:04.266622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.266657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.642 #14 NEW cov: 11715 ft: 13923 corp: 13/19b lim: 5 exec/s: 14 rss: 69Mb L: 1/3 MS: 1 ShuffleBytes- 00:09:45.642 [2024-07-24 13:22:04.326986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.327020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.642 [2024-07-24 13:22:04.327088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 
13:22:04.327112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.642 #15 NEW cov: 11715 ft: 13941 corp: 14/21b lim: 5 exec/s: 15 rss: 70Mb L: 2/3 MS: 1 ChangeByte- 00:09:45.642 [2024-07-24 13:22:04.377461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.377496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.642 [2024-07-24 13:22:04.377542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.377566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.642 [2024-07-24 13:22:04.377634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.377654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.642 [2024-07-24 13:22:04.377722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.377742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:45.642 #16 NEW cov: 11715 ft: 14236 corp: 15/25b lim: 5 exec/s: 16 rss: 70Mb L: 4/4 MS: 1 CopyPart- 00:09:45.642 [2024-07-24 13:22:04.427250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.427285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.642 [2024-07-24 13:22:04.427357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.427378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.642 #17 NEW cov: 11715 ft: 14240 corp: 16/27b lim: 5 exec/s: 17 rss: 70Mb L: 2/4 MS: 1 CrossOver- 00:09:45.642 [2024-07-24 13:22:04.477380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.477415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.642 [2024-07-24 13:22:04.477484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.642 [2024-07-24 13:22:04.477504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.901 #18 NEW cov: 11715 ft: 14283 corp: 17/29b lim: 5 exec/s: 18 rss: 70Mb L: 2/4 MS: 1 InsertByte- 00:09:45.901 [2024-07-24 13:22:04.537774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.901 [2024-07-24 13:22:04.537809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.901 [2024-07-24 
13:22:04.537877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.901 [2024-07-24 13:22:04.537897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.901 [2024-07-24 13:22:04.537971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.901 [2024-07-24 13:22:04.537991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.901 #19 NEW cov: 11715 ft: 14300 corp: 18/32b lim: 5 exec/s: 19 rss: 70Mb L: 3/4 MS: 1 InsertByte- 00:09:45.902 [2024-07-24 13:22:04.597957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.597991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.598060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.598081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.598152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.598171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.902 #20 NEW cov: 
11715 ft: 14309 corp: 19/35b lim: 5 exec/s: 20 rss: 70Mb L: 3/4 MS: 1 CopyPart- 00:09:45.902 [2024-07-24 13:22:04.658261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.658295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.658365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.658386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.658453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.658473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.658542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.658561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:45.902 #21 NEW cov: 11715 ft: 14328 corp: 20/39b lim: 5 exec/s: 21 rss: 70Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:45.902 [2024-07-24 13:22:04.718252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.718285] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.718355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.718375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.902 [2024-07-24 13:22:04.718443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.902 [2024-07-24 13:22:04.718463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.902 #22 NEW cov: 11715 ft: 14354 corp: 21/42b lim: 5 exec/s: 22 rss: 70Mb L: 3/4 MS: 1 ChangeBit- 00:09:46.161 [2024-07-24 13:22:04.778088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.778123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.161 #23 NEW cov: 11715 ft: 14367 corp: 22/43b lim: 5 exec/s: 23 rss: 70Mb L: 1/4 MS: 1 CopyPart- 00:09:46.161 [2024-07-24 13:22:04.818173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.818206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.161 #24 NEW cov: 11715 ft: 14370 corp: 23/44b lim: 5 exec/s: 24 rss: 70Mb L: 1/4 MS: 1 ChangeBit- 00:09:46.161 [2024-07-24 13:22:04.879125] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.879160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.161 [2024-07-24 13:22:04.879227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.879248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:46.161 [2024-07-24 13:22:04.879315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.879335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:46.161 [2024-07-24 13:22:04.879401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.879420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:46.161 [2024-07-24 13:22:04.879486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.879506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:46.161 #25 NEW cov: 11715 ft: 14433 corp: 24/49b lim: 5 exec/s: 25 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:46.161 [2024-07-24 13:22:04.928569] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.928604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.161 #26 NEW cov: 11715 ft: 14452 corp: 25/50b lim: 5 exec/s: 26 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:46.161 [2024-07-24 13:22:04.968636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:04.968670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.161 #27 NEW cov: 11715 ft: 14491 corp: 26/51b lim: 5 exec/s: 27 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:09:46.161 [2024-07-24 13:22:05.018819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.161 [2024-07-24 13:22:05.018853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.421 #28 NEW cov: 11715 ft: 14499 corp: 27/52b lim: 5 exec/s: 28 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:09:46.421 [2024-07-24 13:22:05.059492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.421 [2024-07-24 13:22:05.059526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.421 [2024-07-24 13:22:05.059594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:46.421 [2024-07-24 13:22:05.059614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:46.421 [2024-07-24 13:22:05.059680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.421 [2024-07-24 13:22:05.059700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:46.421 [2024-07-24 13:22:05.059764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.421 [2024-07-24 13:22:05.059784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:46.421 #29 NEW cov: 11715 ft: 14516 corp: 28/56b lim: 5 exec/s: 29 rss: 70Mb L: 4/5 MS: 1 ChangeBinInt- 00:09:46.421 [2024-07-24 13:22:05.119302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.421 [2024-07-24 13:22:05.119336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.421 [2024-07-24 13:22:05.119403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.421 [2024-07-24 13:22:05.119423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:46.421 #30 NEW cov: 11715 ft: 14574 corp: 29/58b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:09:46.421 [2024-07-24 13:22:05.179295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:46.421 [2024-07-24 13:22:05.179329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.421 #31 NEW cov: 11715 ft: 14611 corp: 30/59b lim: 5 exec/s: 15 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:09:46.421 #31 DONE cov: 11715 ft: 14611 corp: 30/59b lim: 5 exec/s: 15 rss: 70Mb 00:09:46.421 Done 31 runs in 2 second(s) 00:09:46.681 13:22:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:09:46.681 13:22:05 -- ../common.sh@72 -- # (( i++ )) 00:09:46.681 13:22:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:46.681 13:22:05 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:09:46.681 13:22:05 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:09:46.681 13:22:05 -- nvmf/run.sh@24 -- # local timen=1 00:09:46.681 13:22:05 -- nvmf/run.sh@25 -- # local core=0x1 00:09:46.681 13:22:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:46.681 13:22:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:09:46.681 13:22:05 -- nvmf/run.sh@29 -- # printf %02d 10 00:09:46.681 13:22:05 -- nvmf/run.sh@29 -- # port=4410 00:09:46.681 13:22:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:46.681 13:22:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:09:46.681 13:22:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:46.681 13:22:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp 
adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:09:46.681 [2024-07-24 13:22:05.390590] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:46.681 [2024-07-24 13:22:05.390675] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170668 ] 00:09:46.681 EAL: No free 2048 kB hugepages reported on node 1 00:09:46.940 [2024-07-24 13:22:05.646728] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:46.940 [2024-07-24 13:22:05.673386] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:46.940 [2024-07-24 13:22:05.673560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:46.940 [2024-07-24 13:22:05.728084] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:46.940 [2024-07-24 13:22:05.744335] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:09:46.940 INFO: Running with entropic power schedule (0xFF, 100). 00:09:46.940 INFO: Seed: 591586155 00:09:46.940 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:46.940 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:46.940 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:46.940 INFO: A corpus is not provided, starting from an empty corpus 00:09:46.940 #2 INITED exec/s: 0 rss: 61Mb 00:09:46.940 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:46.940 This may also happen if the target rejected all inputs we tried so far 00:09:46.941 [2024-07-24 13:22:05.799599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:46.941 [2024-07-24 13:22:05.799648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:46.941 [2024-07-24 13:22:05.799702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:46.941 [2024-07-24 13:22:05.799728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:46.941 [2024-07-24 13:22:05.799776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:46.941 [2024-07-24 13:22:05.799801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:47.458 NEW_FUNC[1/669]: 0x4ab4e0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:09:47.458 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:47.458 #5 NEW cov: 11502 ft: 11512 corp: 2/26b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:09:47.458 [2024-07-24 13:22:06.300920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.458 [2024-07-24 13:22:06.300979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.458 [2024-07-24 13:22:06.301034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.458 [2024-07-24 13:22:06.301059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:47.458 [2024-07-24 13:22:06.301119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.458 [2024-07-24 13:22:06.301143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:47.458 [2024-07-24 13:22:06.301192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.458 [2024-07-24 13:22:06.301223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:47.717 NEW_FUNC[1/1]: 0x1268340 in nvmf_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/nvmf.c:153 00:09:47.717 #6 NEW cov: 11624 ft: 12329 corp: 3/64b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CopyPart- 00:09:47.717 [2024-07-24 13:22:06.400781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.717 [2024-07-24 13:22:06.400829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.717 #10 NEW cov: 11630 ft: 13045 corp: 4/79b lim: 40 exec/s: 0 rss: 69Mb L: 15/38 MS: 4 CopyPart-ShuffleBytes-CopyPart-CrossOver- 00:09:47.717 [2024-07-24 13:22:06.481042] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.717 [2024-07-24 13:22:06.481087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.717 [2024-07-24 13:22:06.481141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.717 [2024-07-24 13:22:06.481167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:47.717 #12 NEW cov: 11715 ft: 13420 corp: 5/102b lim: 40 exec/s: 0 rss: 69Mb L: 23/38 MS: 2 CrossOver-CrossOver- 00:09:47.717 [2024-07-24 13:22:06.551274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.717 [2024-07-24 13:22:06.551320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.717 [2024-07-24 13:22:06.551374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.717 [2024-07-24 13:22:06.551401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:47.976 #14 NEW cov: 11715 ft: 13546 corp: 6/124b lim: 40 exec/s: 0 rss: 69Mb L: 22/38 MS: 2 ChangeBit-CrossOver- 00:09:47.976 [2024-07-24 13:22:06.621506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.621551] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.621605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.621630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.621679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.621703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:47.976 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:47.976 #15 NEW cov: 11732 ft: 13585 corp: 7/149b lim: 40 exec/s: 0 rss: 69Mb L: 25/38 MS: 1 ChangeBinInt- 00:09:47.976 [2024-07-24 13:22:06.701848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.701896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.701953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.701981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.702032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.702057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.702105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.702130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:47.976 #16 NEW cov: 11732 ft: 13692 corp: 8/187b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:47.976 [2024-07-24 13:22:06.802062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.802108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.802161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.802186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.802255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.802280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:47.976 [2024-07-24 13:22:06.802328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:47.976 [2024-07-24 13:22:06.802353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:48.301 #17 NEW cov: 11732 ft: 13718 corp: 9/225b lim: 40 exec/s: 17 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:09:48.301 [2024-07-24 13:22:06.902275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.902320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:06.902373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.902397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:06.902446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.902476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.301 #18 NEW cov: 11732 ft: 13886 corp: 10/249b lim: 40 exec/s: 18 rss: 69Mb L: 24/38 MS: 1 InsertByte- 00:09:48.301 [2024-07-24 13:22:06.953221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.953259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:06.953332] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.953352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:06.953421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.953441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:06.953511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:06.953531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:48.301 #19 NEW cov: 11732 ft: 13984 corp: 11/287b lim: 40 exec/s: 19 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:48.301 [2024-07-24 13:22:07.013202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.013247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:07.013324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.013344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:07.013415] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.013435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.301 #20 NEW cov: 11732 ft: 14053 corp: 12/313b lim: 40 exec/s: 20 rss: 69Mb L: 26/38 MS: 1 InsertByte- 00:09:48.301 [2024-07-24 13:22:07.063362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.063397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:07.063471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.063491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:07.063560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.063580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.301 #21 NEW cov: 11732 ft: 14080 corp: 13/338b lim: 40 exec/s: 21 rss: 69Mb L: 25/38 MS: 1 CrossOver- 00:09:48.301 [2024-07-24 13:22:07.123400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0b0f0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.123436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.301 [2024-07-24 13:22:07.123509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.301 [2024-07-24 13:22:07.123531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.560 #22 NEW cov: 11732 ft: 14161 corp: 14/361b lim: 40 exec/s: 22 rss: 69Mb L: 23/38 MS: 1 InsertByte- 00:09:48.560 [2024-07-24 13:22:07.183683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.560 [2024-07-24 13:22:07.183719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.560 [2024-07-24 13:22:07.183790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.560 [2024-07-24 13:22:07.183811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.560 [2024-07-24 13:22:07.183881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.560 [2024-07-24 13:22:07.183901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.560 #23 NEW cov: 11732 ft: 14181 corp: 15/385b lim: 40 exec/s: 23 rss: 69Mb L: 24/38 MS: 1 CopyPart- 00:09:48.560 [2024-07-24 13:22:07.243566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:09:48.560 [2024-07-24 13:22:07.243601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.560 #24 NEW cov: 11732 ft: 14218 corp: 16/398b lim: 40 exec/s: 24 rss: 70Mb L: 13/38 MS: 1 EraseBytes- 00:09:48.561 [2024-07-24 13:22:07.303994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.561 [2024-07-24 13:22:07.304028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.561 [2024-07-24 13:22:07.304101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.561 [2024-07-24 13:22:07.304120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.561 [2024-07-24 13:22:07.304188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.561 [2024-07-24 13:22:07.304208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.561 #25 NEW cov: 11732 ft: 14274 corp: 17/422b lim: 40 exec/s: 25 rss: 70Mb L: 24/38 MS: 1 CrossOver- 00:09:48.561 [2024-07-24 13:22:07.353852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:1f8ea406 cdw11:99e72d00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.561 [2024-07-24 13:22:07.353886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.561 #28 NEW cov: 11732 ft: 14284 corp: 18/433b lim: 40 exec/s: 28 rss: 70Mb L: 11/38 MS: 3 
CMP-ChangeByte-CMP- DE: "\005\000"-"\037\216\244\006\231\347-\000"- 00:09:48.561 [2024-07-24 13:22:07.404132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.561 [2024-07-24 13:22:07.404166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.561 [2024-07-24 13:22:07.404245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.561 [2024-07-24 13:22:07.404265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.820 #29 NEW cov: 11732 ft: 14304 corp: 19/451b lim: 40 exec/s: 29 rss: 70Mb L: 18/38 MS: 1 CrossOver- 00:09:48.820 [2024-07-24 13:22:07.444514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.444548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.820 [2024-07-24 13:22:07.444621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.444642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.820 [2024-07-24 13:22:07.444713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.444733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.820 [2024-07-24 13:22:07.444805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.444825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:48.820 #30 NEW cov: 11732 ft: 14327 corp: 20/489b lim: 40 exec/s: 30 rss: 70Mb L: 38/38 MS: 1 ChangeBit- 00:09:48.820 [2024-07-24 13:22:07.494279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:1f8ea406 cdw11:e70a002d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.494313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.820 #31 NEW cov: 11732 ft: 14342 corp: 21/500b lim: 40 exec/s: 31 rss: 70Mb L: 11/38 MS: 1 ShuffleBytes- 00:09:48.820 [2024-07-24 13:22:07.554776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.554810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.820 [2024-07-24 13:22:07.554886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.554907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.820 [2024-07-24 13:22:07.554979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0000004a cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.554999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.820 #32 NEW cov: 11732 ft: 14360 corp: 22/524b lim: 40 exec/s: 32 rss: 70Mb L: 24/38 MS: 1 ChangeByte- 00:09:48.820 [2024-07-24 13:22:07.614929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0000001f cdw11:8ea40699 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.820 [2024-07-24 13:22:07.614968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:48.820 [2024-07-24 13:22:07.615045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e72d0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.821 [2024-07-24 13:22:07.615067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:48.821 [2024-07-24 13:22:07.615141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.821 [2024-07-24 13:22:07.615161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:48.821 #33 NEW cov: 11732 ft: 14387 corp: 23/549b lim: 40 exec/s: 33 rss: 70Mb L: 25/38 MS: 1 PersAutoDict- DE: "\037\216\244\006\231\347-\000"- 00:09:48.821 [2024-07-24 13:22:07.664746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:48.821 [2024-07-24 13:22:07.664781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:49.080 #34 NEW cov: 11739 ft: 14447 corp: 
24/564b lim: 40 exec/s: 34 rss: 70Mb L: 15/38 MS: 1 EraseBytes- 00:09:49.080 [2024-07-24 13:22:07.715061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0b0f0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:49.080 [2024-07-24 13:22:07.715096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:49.080 [2024-07-24 13:22:07.715168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:6f000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:49.080 [2024-07-24 13:22:07.715188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:49.080 #35 NEW cov: 11739 ft: 14462 corp: 25/587b lim: 40 exec/s: 35 rss: 70Mb L: 23/38 MS: 1 ChangeByte- 00:09:49.080 [2024-07-24 13:22:07.775522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:49.080 [2024-07-24 13:22:07.775556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:49.080 [2024-07-24 13:22:07.775626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:49.080 [2024-07-24 13:22:07.775647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:49.080 [2024-07-24 13:22:07.775717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:49.080 [2024-07-24 13:22:07.775738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:49.080 [2024-07-24 13:22:07.775810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:3d3d3d3d cdw11:3d3d3d3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:49.080 [2024-07-24 13:22:07.775830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:49.080 #36 NEW cov: 11739 ft: 14486 corp: 26/620b lim: 40 exec/s: 18 rss: 70Mb L: 33/38 MS: 1 InsertRepeatedBytes- 00:09:49.080 #36 DONE cov: 11739 ft: 14486 corp: 26/620b lim: 40 exec/s: 18 rss: 70Mb 00:09:49.080 ###### Recommended dictionary. ###### 00:09:49.080 "\005\000" # Uses: 0 00:09:49.080 "\037\216\244\006\231\347-\000" # Uses: 1 00:09:49.080 ###### End of recommended dictionary. ###### 00:09:49.080 Done 36 runs in 2 second(s) 00:09:49.080 13:22:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:09:49.339 13:22:07 -- ../common.sh@72 -- # (( i++ )) 00:09:49.339 13:22:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:49.339 13:22:07 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:09:49.339 13:22:07 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:09:49.339 13:22:07 -- nvmf/run.sh@24 -- # local timen=1 00:09:49.339 13:22:07 -- nvmf/run.sh@25 -- # local core=0x1 00:09:49.339 13:22:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:49.339 13:22:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:09:49.339 13:22:07 -- nvmf/run.sh@29 -- # printf %02d 11 00:09:49.339 13:22:07 -- nvmf/run.sh@29 -- # port=4411 00:09:49.339 13:22:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:49.339 13:22:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:09:49.339 13:22:07 -- nvmf/run.sh@33 -- # sed -e 
's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:49.339 13:22:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:09:49.339 [2024-07-24 13:22:07.989522] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:49.339 [2024-07-24 13:22:07.989615] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3171023 ] 00:09:49.339 EAL: No free 2048 kB hugepages reported on node 1 00:09:49.597 [2024-07-24 13:22:08.337956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:49.597 [2024-07-24 13:22:08.370824] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:49.597 [2024-07-24 13:22:08.371003] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:49.597 [2024-07-24 13:22:08.425683] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:49.597 [2024-07-24 13:22:08.441942] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:09:49.597 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:49.597 INFO: Seed: 3287572245 00:09:49.856 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:49.856 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:49.856 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:49.856 INFO: A corpus is not provided, starting from an empty corpus 00:09:49.856 #2 INITED exec/s: 0 rss: 61Mb 00:09:49.856 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:49.856 This may also happen if the target rejected all inputs we tried so far 00:09:49.856 [2024-07-24 13:22:08.508161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:49.856 [2024-07-24 13:22:08.508202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:49.856 [2024-07-24 13:22:08.508283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:49.856 [2024-07-24 13:22:08.508303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:49.856 [2024-07-24 13:22:08.508379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:49.856 [2024-07-24 13:22:08.508399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:49.856 [2024-07-24 13:22:08.508471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:49.856 [2024-07-24 13:22:08.508495] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.116 NEW_FUNC[1/671]: 0x4ad250 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:09:50.116 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:50.116 #4 NEW cov: 11523 ft: 11524 corp: 2/38b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 2 ChangeBit-InsertRepeatedBytes- 00:09:50.116 [2024-07-24 13:22:08.979517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.116 [2024-07-24 13:22:08.979574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.116 [2024-07-24 13:22:08.979657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.116 [2024-07-24 13:22:08.979682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.116 [2024-07-24 13:22:08.979761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.116 [2024-07-24 13:22:08.979784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.116 [2024-07-24 13:22:08.979866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4d4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.116 [2024-07-24 13:22:08.979894] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.376 #5 NEW cov: 11636 ft: 12095 corp: 3/75b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 ChangeBit- 00:09:50.376 [2024-07-24 13:22:09.039089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.039125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.039200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4d4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.039225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.376 #6 NEW cov: 11642 ft: 12580 corp: 4/94b lim: 40 exec/s: 0 rss: 69Mb L: 19/37 MS: 1 EraseBytes- 00:09:50.376 [2024-07-24 13:22:09.099547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.099582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.099653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.099673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.099742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c3b cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 
[2024-07-24 13:22:09.099762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.099834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.099858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.376 #7 NEW cov: 11727 ft: 12799 corp: 5/132b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 InsertByte- 00:09:50.376 [2024-07-24 13:22:09.149741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c0c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.149775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.149849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.149869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.149945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.149965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.150039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.150060] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.376 #8 NEW cov: 11727 ft: 12898 corp: 6/169b lim: 40 exec/s: 0 rss: 69Mb L: 37/38 MS: 1 ChangeBit- 00:09:50.376 [2024-07-24 13:22:09.199883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.199917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.199993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:59595959 cdw11:59595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.200013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.200083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:59595959 cdw11:4c4c4c4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.200103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.376 [2024-07-24 13:22:09.200173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.376 [2024-07-24 13:22:09.200193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.376 #9 NEW cov: 11727 ft: 12960 corp: 7/203b lim: 40 exec/s: 0 rss: 69Mb L: 34/38 MS: 1 InsertRepeatedBytes- 00:09:50.638 [2024-07-24 13:22:09.260111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c 
cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.260146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.260216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c7e4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.260236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.260309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.260336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.260410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4d4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.260430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.638 #10 NEW cov: 11727 ft: 13032 corp: 8/240b lim: 40 exec/s: 0 rss: 69Mb L: 37/38 MS: 1 ChangeByte- 00:09:50.638 [2024-07-24 13:22:09.310219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.310254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.310323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.310343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.310416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c3b cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.310436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.310508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.310528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.638 #11 NEW cov: 11727 ft: 13086 corp: 9/278b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:50.638 [2024-07-24 13:22:09.370411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.370448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.370525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.638 [2024-07-24 13:22:09.370545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.638 [2024-07-24 13:22:09.370621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c3b cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:50.638 [2024-07-24 13:22:09.370641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.370716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.370737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.639 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:50.639 #12 NEW cov: 11750 ft: 13164 corp: 10/316b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:09:50.639 [2024-07-24 13:22:09.430629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.430664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.430741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c3b4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.430760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.430832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.430852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.430925] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.430945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.639 #13 NEW cov: 11750 ft: 13198 corp: 11/354b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CopyPart- 00:09:50.639 [2024-07-24 13:22:09.480771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.480806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.480878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c0f00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.480898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.480970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00004c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.480990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.639 [2024-07-24 13:22:09.481063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.639 [2024-07-24 13:22:09.481083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.900 #14 NEW cov: 11750 ft: 13220 corp: 12/392b lim: 40 exec/s: 14 rss: 69Mb L: 
38/38 MS: 1 CMP- DE: "\017\000\000\000\000\000\000\000"- 00:09:50.900 [2024-07-24 13:22:09.540782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a969696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.900 [2024-07-24 13:22:09.540817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.900 [2024-07-24 13:22:09.540890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:96969696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.900 [2024-07-24 13:22:09.540909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.900 [2024-07-24 13:22:09.540984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:96969696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.900 [2024-07-24 13:22:09.541003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.900 #15 NEW cov: 11750 ft: 13444 corp: 13/418b lim: 40 exec/s: 15 rss: 69Mb L: 26/38 MS: 1 InsertRepeatedBytes- 00:09:50.900 [2024-07-24 13:22:09.591056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.900 [2024-07-24 13:22:09.591091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.900 [2024-07-24 13:22:09.591169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:59595959 cdw11:59595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.900 [2024-07-24 13:22:09.591188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.591259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:59595959 cdw11:4c4c4c4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.591278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.591349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c5959 cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.591370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.901 #16 NEW cov: 11750 ft: 13498 corp: 14/454b lim: 40 exec/s: 16 rss: 69Mb L: 36/38 MS: 1 CopyPart- 00:09:50.901 [2024-07-24 13:22:09.651205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.651247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.651322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c3b4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.651343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.651421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.651441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:09:50.901 [2024-07-24 13:22:09.651513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00004c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.651534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:50.901 #17 NEW cov: 11750 ft: 13514 corp: 15/486b lim: 40 exec/s: 17 rss: 70Mb L: 32/38 MS: 1 EraseBytes- 00:09:50.901 [2024-07-24 13:22:09.701179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a969696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.701220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.701297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:969e9696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.701316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.701391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:96969696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.701411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.901 #18 NEW cov: 11750 ft: 13521 corp: 16/512b lim: 40 exec/s: 18 rss: 70Mb L: 26/38 MS: 1 ChangeByte- 00:09:50.901 [2024-07-24 13:22:09.761549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.761584] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.761663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c7e4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.761683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.761754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.761774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:50.901 [2024-07-24 13:22:09.761846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4d4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:50.901 [2024-07-24 13:22:09.761866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.160 #19 NEW cov: 11750 ft: 13531 corp: 17/549b lim: 40 exec/s: 19 rss: 70Mb L: 37/38 MS: 1 CrossOver- 00:09:51.160 [2024-07-24 13:22:09.821986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c0c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.822023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.160 [2024-07-24 13:22:09.822096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.822117] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.160 [2024-07-24 13:22:09.822191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.822216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.160 [2024-07-24 13:22:09.822293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.822314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.160 [2024-07-24 13:22:09.822388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.822408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:51.160 #20 NEW cov: 11750 ft: 13629 corp: 18/589b lim: 40 exec/s: 20 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:09:51.160 [2024-07-24 13:22:09.881946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:354c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.881981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.160 [2024-07-24 13:22:09.882058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c7e4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.160 [2024-07-24 13:22:09.882078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.882149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.882169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.882244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4d4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.882265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.161 #26 NEW cov: 11750 ft: 13648 corp: 19/626b lim: 40 exec/s: 26 rss: 70Mb L: 37/40 MS: 1 ChangeByte- 00:09:51.161 [2024-07-24 13:22:09.931844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.931880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.931953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.931972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.932048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.932068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:09:51.161 #27 NEW cov: 11750 ft: 13731 corp: 20/657b lim: 40 exec/s: 27 rss: 70Mb L: 31/40 MS: 1 EraseBytes- 00:09:51.161 [2024-07-24 13:22:09.992234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.992269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.992345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:59595959 cdw11:59595959 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.992365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.992437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:59595959 cdw11:4c4c4c4d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.992457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.161 [2024-07-24 13:22:09.992526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.161 [2024-07-24 13:22:09.992545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.161 #33 NEW cov: 11750 ft: 13761 corp: 21/691b lim: 40 exec/s: 33 rss: 70Mb L: 34/40 MS: 1 ShuffleBytes- 00:09:51.420 [2024-07-24 13:22:10.043036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.043092] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.043184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c3b4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.043220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.043306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.043333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.043426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.043454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.043540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.043568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:51.420 #34 NEW cov: 11750 ft: 13916 corp: 22/731b lim: 40 exec/s: 34 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:09:51.420 [2024-07-24 13:22:10.112606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.112647] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.112722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.112742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.112814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c3b cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.112833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.112905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.112925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.420 #35 NEW cov: 11750 ft: 13993 corp: 23/769b lim: 40 exec/s: 35 rss: 70Mb L: 38/40 MS: 1 ShuffleBytes- 00:09:51.420 [2024-07-24 13:22:10.162708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.420 [2024-07-24 13:22:10.162744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.420 [2024-07-24 13:22:10.162820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c0a4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.162840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.162912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c3b cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.162932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.163001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.163020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.421 #36 NEW cov: 11750 ft: 14007 corp: 24/807b lim: 40 exec/s: 36 rss: 70Mb L: 38/40 MS: 1 CrossOver- 00:09:51.421 [2024-07-24 13:22:10.222891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.222925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.223004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.223024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.223095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2ce79a8f cdw11:0b007c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.223114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.223186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.223206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.421 #37 NEW cov: 11750 ft: 14047 corp: 25/845b lim: 40 exec/s: 37 rss: 70Mb L: 38/40 MS: 1 CMP- DE: "\377,\347\232\217\013\000|"- 00:09:51.421 [2024-07-24 13:22:10.273001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.273035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.273112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.273132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.273203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2ce79a8f cdw11:0b007c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.273227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.421 [2024-07-24 13:22:10.273297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.421 [2024-07-24 13:22:10.273317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:09:51.680 #38 NEW cov: 11750 ft: 14087 corp: 26/883b lim: 40 exec/s: 38 rss: 70Mb L: 38/40 MS: 1 CopyPart- 00:09:51.680 [2024-07-24 13:22:10.333260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.333294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.333366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c0aafb3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.333386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.333457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c3b cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.333476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.333551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.333571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:51.680 #39 NEW cov: 11750 ft: 14101 corp: 27/921b lim: 40 exec/s: 39 rss: 70Mb L: 38/40 MS: 1 ChangeBinInt- 00:09:51.680 [2024-07-24 13:22:10.393199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a969696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.393238] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.393309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:969e9696 cdw11:96969696 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.393329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.393399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:96969696 cdw11:9696962f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.393419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.680 #40 NEW cov: 11750 ft: 14131 corp: 28/947b lim: 40 exec/s: 40 rss: 70Mb L: 26/40 MS: 1 ChangeByte- 00:09:51.680 [2024-07-24 13:22:10.453410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.453444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.453520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4d4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.453540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:51.680 [2024-07-24 13:22:10.453613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4c4c4c4c cdw11:4c4c4c4c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.680 [2024-07-24 13:22:10.453633] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:51.680 #41 NEW cov: 11750 ft: 14146 corp: 29/972b lim: 40 exec/s: 20 rss: 70Mb L: 25/40 MS: 1 EraseBytes- 00:09:51.680 #41 DONE cov: 11750 ft: 14146 corp: 29/972b lim: 40 exec/s: 20 rss: 70Mb 00:09:51.680 ###### Recommended dictionary. ###### 00:09:51.680 "\000\000\000\000\000\000\000\000" # Uses: 0 00:09:51.680 "\017\000\000\000\000\000\000\000" # Uses: 0 00:09:51.680 "\377,\347\232\217\013\000|" # Uses: 0 00:09:51.680 ###### End of recommended dictionary. ###### 00:09:51.680 Done 41 runs in 2 second(s) 00:09:51.940 13:22:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:09:51.940 13:22:10 -- ../common.sh@72 -- # (( i++ )) 00:09:51.940 13:22:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:51.940 13:22:10 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:09:51.940 13:22:10 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:09:51.940 13:22:10 -- nvmf/run.sh@24 -- # local timen=1 00:09:51.940 13:22:10 -- nvmf/run.sh@25 -- # local core=0x1 00:09:51.940 13:22:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:51.940 13:22:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:09:51.940 13:22:10 -- nvmf/run.sh@29 -- # printf %02d 12 00:09:51.940 13:22:10 -- nvmf/run.sh@29 -- # port=4412 00:09:51.940 13:22:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:51.940 13:22:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:09:51.940 13:22:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:51.940 13:22:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:09:51.940 [2024-07-24 13:22:10.676944] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:09:51.940 [2024-07-24 13:22:10.677022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3171385 ] 00:09:51.940 EAL: No free 2048 kB hugepages reported on node 1 00:09:52.199 [2024-07-24 13:22:11.020162] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.199 [2024-07-24 13:22:11.051245] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:52.199 [2024-07-24 13:22:11.051424] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.458 [2024-07-24 13:22:11.105980] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:52.458 [2024-07-24 13:22:11.122206] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:09:52.458 INFO: Running with entropic power schedule (0xFF, 100). 00:09:52.458 INFO: Seed: 1674601740 00:09:52.458 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:52.458 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:52.458 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:52.458 INFO: A corpus is not provided, starting from an empty corpus 00:09:52.458 #2 INITED exec/s: 0 rss: 61Mb 00:09:52.458 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:52.458 This may also happen if the target rejected all inputs we tried so far 00:09:52.458 [2024-07-24 13:22:11.177677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.458 [2024-07-24 13:22:11.177722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.458 [2024-07-24 13:22:11.177773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.458 [2024-07-24 13:22:11.177798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.458 [2024-07-24 13:22:11.177845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.458 [2024-07-24 13:22:11.177868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:52.458 [2024-07-24 13:22:11.177914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.458 [2024-07-24 13:22:11.177937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.025 NEW_FUNC[1/671]: 0x4aefc0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:09:53.025 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:53.025 #18 NEW cov: 11521 ft: 11522 corp: 2/39b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 
MS: 1 InsertRepeatedBytes- 00:09:53.025 [2024-07-24 13:22:11.678864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.678923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.025 [2024-07-24 13:22:11.678975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.679000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.025 [2024-07-24 13:22:11.679051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.679075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.025 [2024-07-24 13:22:11.679120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:40787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.679144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.025 #19 NEW cov: 11634 ft: 12046 corp: 3/77b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeByte- 00:09:53.025 [2024-07-24 13:22:11.779007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.779052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:09:53.025 [2024-07-24 13:22:11.779104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.779128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.025 [2024-07-24 13:22:11.779173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.025 [2024-07-24 13:22:11.779197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.026 [2024-07-24 13:22:11.779250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.026 [2024-07-24 13:22:11.779275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.026 #20 NEW cov: 11640 ft: 12191 corp: 4/115b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:53.026 [2024-07-24 13:22:11.849155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.026 [2024-07-24 13:22:11.849197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.026 [2024-07-24 13:22:11.849255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.026 [2024-07-24 13:22:11.849281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.026 [2024-07-24 
13:22:11.849326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.026 [2024-07-24 13:22:11.849350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.026 [2024-07-24 13:22:11.849394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.026 [2024-07-24 13:22:11.849419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.284 #21 NEW cov: 11725 ft: 12530 corp: 5/153b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:53.285 [2024-07-24 13:22:11.949485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:11.949526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:11.949584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:11.949608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:11.949654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:11.949678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:11.949724] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:11.949748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.285 #27 NEW cov: 11725 ft: 12591 corp: 6/191b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:09:53.285 [2024-07-24 13:22:12.039660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.039703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:12.039754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.039778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:12.039823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78788378 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.039846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:12.039891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.039915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.285 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:53.285 #28 NEW cov: 11742 ft: 12704 corp: 7/229b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:53.285 [2024-07-24 13:22:12.139993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.140037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:12.140087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.140112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:12.140158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.140181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.285 [2024-07-24 13:22:12.140236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.285 [2024-07-24 13:22:12.140260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.544 #29 NEW cov: 11742 ft: 12790 corp: 8/267b lim: 40 exec/s: 29 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:53.544 [2024-07-24 13:22:12.210078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:53.544 [2024-07-24 13:22:12.210121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.544 [2024-07-24 13:22:12.210171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.210195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.544 [2024-07-24 13:22:12.210251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.210275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.544 #30 NEW cov: 11742 ft: 13116 corp: 9/294b lim: 40 exec/s: 30 rss: 69Mb L: 27/38 MS: 1 EraseBytes- 00:09:53.544 [2024-07-24 13:22:12.300243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.300286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.544 [2024-07-24 13:22:12.300335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.300360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.544 #31 NEW cov: 11742 ft: 13377 corp: 10/317b lim: 40 exec/s: 31 rss: 69Mb L: 23/38 MS: 1 EraseBytes- 00:09:53.544 [2024-07-24 13:22:12.390642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.390684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.544 [2024-07-24 13:22:12.390735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78797878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.390760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.544 [2024-07-24 13:22:12.390806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.390830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.544 [2024-07-24 13:22:12.390874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.544 [2024-07-24 13:22:12.390898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.803 #32 NEW cov: 11742 ft: 13436 corp: 11/355b lim: 40 exec/s: 32 rss: 69Mb L: 38/38 MS: 1 ChangeBit- 00:09:53.803 [2024-07-24 13:22:12.460750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.460793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.803 [2024-07-24 13:22:12.460843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.460872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.803 [2024-07-24 13:22:12.460918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.460942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.803 #33 NEW cov: 11742 ft: 13470 corp: 12/385b lim: 40 exec/s: 33 rss: 69Mb L: 30/38 MS: 1 InsertRepeatedBytes- 00:09:53.803 [2024-07-24 13:22:12.540892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.540933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.803 [2024-07-24 13:22:12.540983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78781778 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.541007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.803 #34 NEW cov: 11742 ft: 13489 corp: 13/408b lim: 40 exec/s: 34 rss: 69Mb L: 23/38 MS: 1 ChangeBinInt- 00:09:53.803 [2024-07-24 13:22:12.631287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.631329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.803 [2024-07-24 13:22:12.631380] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.631403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.803 [2024-07-24 13:22:12.631449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.631473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.803 [2024-07-24 13:22:12.631518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.803 [2024-07-24 13:22:12.631541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.062 #35 NEW cov: 11742 ft: 13509 corp: 14/446b lim: 40 exec/s: 35 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:09:54.063 [2024-07-24 13:22:12.701289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.701331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.063 [2024-07-24 13:22:12.701380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78781778 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.701404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.063 #36 NEW cov: 11742 ft: 13541 corp: 15/469b lim: 40 
exec/s: 36 rss: 69Mb L: 23/38 MS: 1 ShuffleBytes- 00:09:54.063 [2024-07-24 13:22:12.791721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.791763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.063 [2024-07-24 13:22:12.791814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.791843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.063 [2024-07-24 13:22:12.791888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.791911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.063 [2024-07-24 13:22:12.791956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:40787c78 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.791981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.063 #37 NEW cov: 11742 ft: 13646 corp: 16/507b lim: 40 exec/s: 37 rss: 69Mb L: 38/38 MS: 1 ChangeBit- 00:09:54.063 [2024-07-24 13:22:12.861825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.861867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:09:54.063 [2024-07-24 13:22:12.861917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.861941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.063 [2024-07-24 13:22:12.861987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.063 [2024-07-24 13:22:12.862011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.063 #38 NEW cov: 11742 ft: 13658 corp: 17/534b lim: 40 exec/s: 38 rss: 70Mb L: 27/38 MS: 1 EraseBytes- 00:09:54.322 [2024-07-24 13:22:12.942964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.942991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:12.943052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:f0f0f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.943067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:12.943126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f0000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.943140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:09:54.322 [2024-07-24 13:22:12.943199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000f0f0 cdw11:f0f0f0f0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.943216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.322 #39 NEW cov: 11742 ft: 13855 corp: 18/570b lim: 40 exec/s: 39 rss: 70Mb L: 36/38 MS: 1 InsertRepeatedBytes- 00:09:54.322 [2024-07-24 13:22:12.983099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.983124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:12.983189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78797878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.983203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:12.983230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78788887 cdw11:877f7878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.983244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:12.983301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:12.983315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.322 #40 
NEW cov: 11742 ft: 13918 corp: 19/608b lim: 40 exec/s: 40 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:54.322 [2024-07-24 13:22:13.023223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.023249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:13.023309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.023324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:13.023386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.023400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:13.023457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787826 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.023471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.322 #46 NEW cov: 11742 ft: 13940 corp: 20/646b lim: 40 exec/s: 46 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:54.322 [2024-07-24 13:22:13.063332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:02787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.063359] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:13.063420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.063435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.322 [2024-07-24 13:22:13.063493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.322 [2024-07-24 13:22:13.063507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.323 [2024-07-24 13:22:13.063567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.063581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.323 #47 NEW cov: 11749 ft: 13952 corp: 21/684b lim: 40 exec/s: 47 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt- 00:09:54.323 [2024-07-24 13:22:13.103065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a781700 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.103091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.323 [2024-07-24 13:22:13.103152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00007878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.103167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.323 #48 NEW cov: 11749 ft: 13966 corp: 22/707b lim: 40 exec/s: 48 rss: 70Mb L: 23/38 MS: 1 ChangeBinInt- 00:09:54.323 [2024-07-24 13:22:13.143531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.143559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.323 [2024-07-24 13:22:13.143619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:78787878 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.143633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.323 [2024-07-24 13:22:13.143691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:789c7878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.143705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.323 [2024-07-24 13:22:13.143764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:78787826 cdw11:78787878 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.323 [2024-07-24 13:22:13.143778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.323 #49 NEW cov: 11749 ft: 13970 corp: 23/745b lim: 40 exec/s: 24 rss: 70Mb L: 38/38 MS: 1 ChangeByte- 00:09:54.323 #49 DONE cov: 11749 ft: 13970 corp: 23/745b lim: 40 exec/s: 24 rss: 70Mb 00:09:54.323 Done 49 runs in 2 second(s) 00:09:54.582 13:22:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:09:54.582 13:22:13 -- 
../common.sh@72 -- # (( i++ )) 00:09:54.582 13:22:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:54.582 13:22:13 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:09:54.582 13:22:13 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:09:54.582 13:22:13 -- nvmf/run.sh@24 -- # local timen=1 00:09:54.582 13:22:13 -- nvmf/run.sh@25 -- # local core=0x1 00:09:54.582 13:22:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:54.582 13:22:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:09:54.582 13:22:13 -- nvmf/run.sh@29 -- # printf %02d 13 00:09:54.582 13:22:13 -- nvmf/run.sh@29 -- # port=4413 00:09:54.582 13:22:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:54.582 13:22:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:09:54.582 13:22:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:54.582 13:22:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:09:54.582 [2024-07-24 13:22:13.350358] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:54.582 [2024-07-24 13:22:13.350437] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3171755 ] 00:09:54.582 EAL: No free 2048 kB hugepages reported on node 1 00:09:54.840 [2024-07-24 13:22:13.700058] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:55.098 [2024-07-24 13:22:13.733513] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:55.098 [2024-07-24 13:22:13.733695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.098 [2024-07-24 13:22:13.788461] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:55.098 [2024-07-24 13:22:13.804697] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:09:55.098 INFO: Running with entropic power schedule (0xFF, 100). 00:09:55.098 INFO: Seed: 60643547 00:09:55.098 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:55.098 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:55.098 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:55.098 INFO: A corpus is not provided, starting from an empty corpus 00:09:55.098 #2 INITED exec/s: 0 rss: 61Mb 00:09:55.098 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:55.098 This may also happen if the target rejected all inputs we tried so far 00:09:55.098 [2024-07-24 13:22:13.860658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.098 [2024-07-24 13:22:13.860698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.098 [2024-07-24 13:22:13.860774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.098 [2024-07-24 13:22:13.860795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.098 [2024-07-24 13:22:13.860869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.098 [2024-07-24 13:22:13.860889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.665 NEW_FUNC[1/670]: 0x4b0b80 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:09:55.665 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:55.665 #12 NEW cov: 11509 ft: 11510 corp: 2/30b lim: 40 exec/s: 0 rss: 68Mb L: 29/29 MS: 5 ChangeByte-CrossOver-ChangeByte-InsertByte-InsertRepeatedBytes- 00:09:55.665 [2024-07-24 13:22:14.331720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.331767] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.665 [2024-07-24 13:22:14.331841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.331861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.665 #14 NEW cov: 11622 ft: 12223 corp: 3/51b lim: 40 exec/s: 0 rss: 68Mb L: 21/29 MS: 2 CMP-InsertRepeatedBytes- DE: "\377\377\377\022"- 00:09:55.665 [2024-07-24 13:22:14.381882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.381919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.665 [2024-07-24 13:22:14.381996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.382021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.665 [2024-07-24 13:22:14.382099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.382119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.665 #15 NEW cov: 11628 ft: 12454 corp: 4/80b lim: 40 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ShuffleBytes- 00:09:55.665 [2024-07-24 13:22:14.441950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) 
qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.441985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.665 [2024-07-24 13:22:14.442058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.442079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.665 #16 NEW cov: 11713 ft: 12673 corp: 5/101b lim: 40 exec/s: 0 rss: 69Mb L: 21/29 MS: 1 ChangeBit- 00:09:55.665 [2024-07-24 13:22:14.502275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.502309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.665 [2024-07-24 13:22:14.502384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.502405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.665 [2024-07-24 13:22:14.502480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.665 [2024-07-24 13:22:14.502499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.924 #17 NEW cov: 11713 ft: 12765 corp: 6/130b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ChangeBit- 00:09:55.924 
[2024-07-24 13:22:14.562482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.924 [2024-07-24 13:22:14.562517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.924 [2024-07-24 13:22:14.562594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.924 [2024-07-24 13:22:14.562615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.924 [2024-07-24 13:22:14.562689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.924 [2024-07-24 13:22:14.562709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.924 #18 NEW cov: 11713 ft: 12822 corp: 7/159b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ShuffleBytes- 00:09:55.924 [2024-07-24 13:22:14.622640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.924 [2024-07-24 13:22:14.622678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.924 [2024-07-24 13:22:14.622755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.622776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.925 
[2024-07-24 13:22:14.622851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00770000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.622871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.925 #19 NEW cov: 11713 ft: 12894 corp: 8/188b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ChangeByte- 00:09:55.925 [2024-07-24 13:22:14.672787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.672822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.925 [2024-07-24 13:22:14.672900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.672921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.925 [2024-07-24 13:22:14.672993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.673013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.925 #20 NEW cov: 11713 ft: 12933 corp: 9/217b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ChangeBinInt- 00:09:55.925 [2024-07-24 13:22:14.722831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.722865] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.925 [2024-07-24 13:22:14.722944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:12000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.722965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.925 [2024-07-24 13:22:14.723039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffff52 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.723059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.925 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:55.925 #21 NEW cov: 11736 ft: 12975 corp: 10/242b lim: 40 exec/s: 0 rss: 69Mb L: 25/29 MS: 1 PersAutoDict- DE: "\377\377\377\022"- 00:09:55.925 [2024-07-24 13:22:14.783017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.783053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.925 [2024-07-24 13:22:14.783127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.783148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.925 [2024-07-24 13:22:14.783228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.925 [2024-07-24 13:22:14.783249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.184 #22 NEW cov: 11736 ft: 13018 corp: 11/271b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ShuffleBytes- 00:09:56.184 [2024-07-24 13:22:14.833259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.833296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.833371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.833393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.833470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00250002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.833490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.185 #23 NEW cov: 11736 ft: 13051 corp: 12/300b lim: 40 exec/s: 23 rss: 69Mb L: 29/29 MS: 1 ChangeByte- 00:09:56.185 [2024-07-24 13:22:14.893160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.893197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:09:56.185 [2024-07-24 13:22:14.893280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.893301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.185 #24 NEW cov: 11736 ft: 13077 corp: 13/322b lim: 40 exec/s: 24 rss: 69Mb L: 22/29 MS: 1 InsertByte- 00:09:56.185 [2024-07-24 13:22:14.943496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.943532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.943606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.943627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.943699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.943719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.185 #25 NEW cov: 11736 ft: 13109 corp: 14/349b lim: 40 exec/s: 25 rss: 69Mb L: 27/29 MS: 1 EraseBytes- 00:09:56.185 [2024-07-24 13:22:14.983789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.983824] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.983897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.983921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.983993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00250002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.984013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.185 [2024-07-24 13:22:14.984083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:14.984102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:56.185 #26 NEW cov: 11736 ft: 13587 corp: 15/382b lim: 40 exec/s: 26 rss: 69Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:09:56.185 [2024-07-24 13:22:15.043459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.185 [2024-07-24 13:22:15.043495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.444 #27 NEW cov: 11736 ft: 13919 corp: 16/396b lim: 40 exec/s: 27 rss: 69Mb L: 14/33 MS: 1 EraseBytes- 00:09:56.444 [2024-07-24 13:22:15.093898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 
nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.093935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.444 [2024-07-24 13:22:15.094011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.094031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.444 [2024-07-24 13:22:15.094105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.094126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.444 #28 NEW cov: 11736 ft: 13955 corp: 17/425b lim: 40 exec/s: 28 rss: 70Mb L: 29/33 MS: 1 ChangeByte- 00:09:56.444 [2024-07-24 13:22:15.154085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.154121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.444 [2024-07-24 13:22:15.154198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.154225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.444 [2024-07-24 13:22:15.154294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 
nsid:0 cdw10:00000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.154315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.444 #29 NEW cov: 11736 ft: 13967 corp: 18/454b lim: 40 exec/s: 29 rss: 70Mb L: 29/33 MS: 1 ShuffleBytes- 00:09:56.444 [2024-07-24 13:22:15.204265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.204299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.444 [2024-07-24 13:22:15.204378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:12000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.204399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.444 [2024-07-24 13:22:15.204470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffff52 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.444 [2024-07-24 13:22:15.204489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.445 #30 NEW cov: 11736 ft: 14015 corp: 19/479b lim: 40 exec/s: 30 rss: 70Mb L: 25/33 MS: 1 CopyPart- 00:09:56.445 [2024-07-24 13:22:15.264544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:52010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.445 [2024-07-24 13:22:15.264579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.445 [2024-07-24 
13:22:15.264650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:020aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.445 [2024-07-24 13:22:15.264671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.445 [2024-07-24 13:22:15.264742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.445 [2024-07-24 13:22:15.264762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.445 [2024-07-24 13:22:15.264832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.445 [2024-07-24 13:22:15.264852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:56.445 #31 NEW cov: 11736 ft: 14027 corp: 20/516b lim: 40 exec/s: 31 rss: 70Mb L: 37/37 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\002"- 00:09:56.703 [2024-07-24 13:22:15.314569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.314603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.703 [2024-07-24 13:22:15.314681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00470000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.314702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:09:56.703 [2024-07-24 13:22:15.314777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.314797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.703 #32 NEW cov: 11736 ft: 14054 corp: 21/545b lim: 40 exec/s: 32 rss: 70Mb L: 29/37 MS: 1 CMP- DE: "G\000\000\000\000\000\000\000"- 00:09:56.703 [2024-07-24 13:22:15.354698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00050000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.354733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.703 [2024-07-24 13:22:15.354810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.354835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.703 [2024-07-24 13:22:15.354910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:12000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.354930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.703 #33 NEW cov: 11736 ft: 14061 corp: 22/574b lim: 40 exec/s: 33 rss: 70Mb L: 29/37 MS: 1 CMP- DE: "\000\000\000\005"- 00:09:56.703 [2024-07-24 13:22:15.404873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:09:56.703 [2024-07-24 13:22:15.404907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.703 [2024-07-24 13:22:15.404983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:47000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.703 [2024-07-24 13:22:15.405004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.405078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.405099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.704 #34 NEW cov: 11736 ft: 14076 corp: 23/603b lim: 40 exec/s: 34 rss: 70Mb L: 29/37 MS: 1 PersAutoDict- DE: "G\000\000\000\000\000\000\000"- 00:09:56.704 [2024-07-24 13:22:15.454975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.455012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.455089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:12000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.455111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.455186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffff0000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.455206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.704 #35 NEW cov: 11736 ft: 14084 corp: 24/628b lim: 40 exec/s: 35 rss: 70Mb L: 25/37 MS: 1 CopyPart- 00:09:56.704 [2024-07-24 13:22:15.495117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.495151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.495233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.495254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.495328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.495349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.704 #36 NEW cov: 11736 ft: 14091 corp: 25/657b lim: 40 exec/s: 36 rss: 70Mb L: 29/37 MS: 1 ShuffleBytes- 00:09:56.704 [2024-07-24 13:22:15.535245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.535280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.535358] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.535390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.704 [2024-07-24 13:22:15.535466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:faffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.704 [2024-07-24 13:22:15.535486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.704 #37 NEW cov: 11736 ft: 14105 corp: 26/686b lim: 40 exec/s: 37 rss: 70Mb L: 29/37 MS: 1 ChangeBinInt- 00:09:56.963 [2024-07-24 13:22:15.575345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.575379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.575455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:12000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.575476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.575546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffff52 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.575565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.963 #38 NEW cov: 11736 ft: 14108 corp: 27/711b 
lim: 40 exec/s: 38 rss: 70Mb L: 25/37 MS: 1 ShuffleBytes- 00:09:56.963 [2024-07-24 13:22:15.635720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:520aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.635754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.635829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.635850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.635920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:faffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.635939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.636010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.636030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:56.963 #39 NEW cov: 11736 ft: 14124 corp: 28/744b lim: 40 exec/s: 39 rss: 70Mb L: 33/37 MS: 1 CopyPart- 00:09:56.963 [2024-07-24 13:22:15.695716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:522aff00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.695750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.695831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00470000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.695851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.695927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.695947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.963 #40 NEW cov: 11736 ft: 14146 corp: 29/773b lim: 40 exec/s: 40 rss: 70Mb L: 29/37 MS: 1 ChangeBit- 00:09:56.963 [2024-07-24 13:22:15.755849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.755883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.755960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ffff1200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.755982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.756055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffff52 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.756075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.963 #41 NEW cov: 11736 ft: 14151 corp: 30/798b lim: 40 exec/s: 41 rss: 70Mb L: 25/37 MS: 1 ShuffleBytes- 00:09:56.963 [2024-07-24 13:22:15.795971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.796005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.796079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:12000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.796101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.963 [2024-07-24 13:22:15.796176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff31ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.963 [2024-07-24 13:22:15.796195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.963 #42 NEW cov: 11736 ft: 14154 corp: 31/824b lim: 40 exec/s: 42 rss: 70Mb L: 26/37 MS: 1 InsertByte- 00:09:57.223 [2024-07-24 13:22:15.846281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:57.223 [2024-07-24 13:22:15.846316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.223 [2024-07-24 13:22:15.846388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:520aff00 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:09:57.223 [2024-07-24 13:22:15.846408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.223 [2024-07-24 13:22:15.846483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:57.223 [2024-07-24 13:22:15.846506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.223 [2024-07-24 13:22:15.846584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:57.223 [2024-07-24 13:22:15.846604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.223 #43 NEW cov: 11736 ft: 14159 corp: 32/859b lim: 40 exec/s: 21 rss: 70Mb L: 35/37 MS: 1 CrossOver- 00:09:57.223 #43 DONE cov: 11736 ft: 14159 corp: 32/859b lim: 40 exec/s: 21 rss: 70Mb 00:09:57.223 ###### Recommended dictionary. ###### 00:09:57.223 "\377\377\377\022" # Uses: 1 00:09:57.223 "\001\000\000\000\000\000\000\002" # Uses: 0 00:09:57.223 "G\000\000\000\000\000\000\000" # Uses: 1 00:09:57.223 "\000\000\000\005" # Uses: 0 00:09:57.223 ###### End of recommended dictionary. 
###### 00:09:57.223 Done 43 runs in 2 second(s) 00:09:57.223 13:22:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:09:57.223 13:22:16 -- ../common.sh@72 -- # (( i++ )) 00:09:57.223 13:22:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:57.223 13:22:16 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:09:57.223 13:22:16 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:09:57.223 13:22:16 -- nvmf/run.sh@24 -- # local timen=1 00:09:57.223 13:22:16 -- nvmf/run.sh@25 -- # local core=0x1 00:09:57.223 13:22:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:57.223 13:22:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:09:57.223 13:22:16 -- nvmf/run.sh@29 -- # printf %02d 14 00:09:57.223 13:22:16 -- nvmf/run.sh@29 -- # port=4414 00:09:57.223 13:22:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:57.223 13:22:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:09:57.223 13:22:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:57.223 13:22:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:09:57.223 [2024-07-24 13:22:16.056474] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:09:57.223 [2024-07-24 13:22:16.056562] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172122 ] 00:09:57.482 EAL: No free 2048 kB hugepages reported on node 1 00:09:57.741 [2024-07-24 13:22:16.407092] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:57.741 [2024-07-24 13:22:16.437012] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:57.741 [2024-07-24 13:22:16.437188] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:57.741 [2024-07-24 13:22:16.491791] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:57.741 [2024-07-24 13:22:16.508021] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:09:57.741 INFO: Running with entropic power schedule (0xFF, 100). 00:09:57.741 INFO: Seed: 2762639463 00:09:57.741 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:09:57.741 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:09:57.741 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:57.741 INFO: A corpus is not provided, starting from an empty corpus 00:09:57.741 #2 INITED exec/s: 0 rss: 61Mb 00:09:57.741 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:57.741 This may also happen if the target rejected all inputs we tried so far 00:09:57.741 [2024-07-24 13:22:16.557880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.741 [2024-07-24 13:22:16.557924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.741 [2024-07-24 13:22:16.557999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.741 [2024-07-24 13:22:16.558020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.741 [2024-07-24 13:22:16.558094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.741 [2024-07-24 13:22:16.558112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.741 [2024-07-24 13:22:16.558187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.741 [2024-07-24 13:22:16.558208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.741 [2024-07-24 13:22:16.558287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.741 [2024-07-24 13:22:16.558307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.309 NEW_FUNC[1/671]: 0x4b2740 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:09:58.309 
NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:58.309 #6 NEW cov: 11503 ft: 11504 corp: 2/36b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 4 CrossOver-InsertByte-ChangeByte-InsertRepeatedBytes- 00:09:58.309 [2024-07-24 13:22:17.040899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.040947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.041058] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.041079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.041184] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.041205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.041316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.041336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.309 #11 NEW cov: 11616 ft: 12077 corp: 3/69b lim: 35 exec/s: 0 rss: 68Mb L: 33/35 MS: 5 CrossOver-CopyPart-EraseBytes-InsertByte-CrossOver- 00:09:58.309 [2024-07-24 13:22:17.101029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:58.309 [2024-07-24 13:22:17.101065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.101169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.101190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.101296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.101325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.101427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.101446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.309 #17 NEW cov: 11622 ft: 12349 corp: 4/103b lim: 35 exec/s: 0 rss: 68Mb L: 34/35 MS: 1 InsertByte- 00:09:58.309 [2024-07-24 13:22:17.171475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.171511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.171616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.171637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.171747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.171768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.309 [2024-07-24 13:22:17.171877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.309 [2024-07-24 13:22:17.171896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.568 #23 NEW cov: 11707 ft: 12626 corp: 5/137b lim: 35 exec/s: 0 rss: 68Mb L: 34/35 MS: 1 InsertByte- 00:09:58.568 [2024-07-24 13:22:17.231458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.231493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.568 [2024-07-24 13:22:17.231603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.231623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.568 NEW_FUNC[1/2]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:09:58.568 NEW_FUNC[2/2]: 0x11719a0 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:09:58.568 #24 NEW cov: 11740 ft: 12966 corp: 6/161b lim: 35 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertRepeatedBytes- 00:09:58.568 [2024-07-24 13:22:17.302421] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.302458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.568 [2024-07-24 13:22:17.302563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.302584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.568 [2024-07-24 13:22:17.302692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.302713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.568 [2024-07-24 13:22:17.302823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.302848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.568 [2024-07-24 13:22:17.302952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 13:22:17.302973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.568 #30 NEW cov: 11740 ft: 13055 corp: 7/196b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:58.568 [2024-07-24 13:22:17.372023] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.568 [2024-07-24 
13:22:17.372058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.568 [2024-07-24 13:22:17.372163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.569 [2024-07-24 13:22:17.372184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.569 #36 NEW cov: 11740 ft: 13083 corp: 8/220b lim: 35 exec/s: 0 rss: 69Mb L: 24/35 MS: 1 CrossOver- 00:09:58.828 [2024-07-24 13:22:17.442328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.442363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.442476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.442496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.828 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:58.828 #37 NEW cov: 11763 ft: 13133 corp: 9/244b lim: 35 exec/s: 0 rss: 69Mb L: 24/35 MS: 1 ChangeASCIIInt- 00:09:58.828 [2024-07-24 13:22:17.512868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.512903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.513013] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 
cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.513033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.513137] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.513158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.513256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.513276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.828 #38 NEW cov: 11763 ft: 13188 corp: 10/278b lim: 35 exec/s: 38 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:09:58.828 [2024-07-24 13:22:17.583528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.583564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.583672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.583697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.583811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.583832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.583943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.583963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.828 #39 NEW cov: 11763 ft: 13214 corp: 11/312b lim: 35 exec/s: 39 rss: 69Mb L: 34/35 MS: 1 ChangeASCIIInt- 00:09:58.828 [2024-07-24 13:22:17.643693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.643728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.828 [2024-07-24 13:22:17.643847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.828 [2024-07-24 13:22:17.643868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.828 #40 NEW cov: 11763 ft: 13254 corp: 12/337b lim: 35 exec/s: 40 rss: 69Mb L: 25/35 MS: 1 InsertByte- 00:09:59.087 [2024-07-24 13:22:17.714099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.714135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.087 [2024-07-24 13:22:17.714258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.714281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:09:59.087 #41 NEW cov: 11763 ft: 13315 corp: 13/361b lim: 35 exec/s: 41 rss: 69Mb L: 24/35 MS: 1 ChangeByte- 00:09:59.087 [2024-07-24 13:22:17.774484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.774521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.087 [2024-07-24 13:22:17.774624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.774647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.087 #42 NEW cov: 11763 ft: 13333 corp: 14/385b lim: 35 exec/s: 42 rss: 69Mb L: 24/35 MS: 1 ChangeBit- 00:09:59.087 [2024-07-24 13:22:17.845576] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.845612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.087 [2024-07-24 13:22:17.845725] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.845746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.087 [2024-07-24 13:22:17.845848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.845869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.087 [2024-07-24 13:22:17.845979] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.846000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.087 #43 NEW cov: 11763 ft: 13373 corp: 15/420b lim: 35 exec/s: 43 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:09:59.087 [2024-07-24 13:22:17.915196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.915237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.087 [2024-07-24 13:22:17.915344] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.087 [2024-07-24 13:22:17.915366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.087 #49 NEW cov: 11763 ft: 13399 corp: 16/444b lim: 35 exec/s: 49 rss: 69Mb L: 24/35 MS: 1 CMP- DE: "T_\012\002\000\000\000\000"- 00:09:59.346 [2024-07-24 13:22:17.976475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:17.976510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:17.976620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:17.976641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:17.976745] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:17.976766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:17.976869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:17.976889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:17.976996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:17.977016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.346 #50 NEW cov: 11763 ft: 13434 corp: 17/479b lim: 35 exec/s: 50 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:09:59.346 [2024-07-24 13:22:18.046619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.046653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.046761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.046782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.046892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 
13:22:18.046911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.047010] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.047031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.047136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.047157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.346 #51 NEW cov: 11763 ft: 13505 corp: 18/514b lim: 35 exec/s: 51 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:09:59.346 [2024-07-24 13:22:18.106371] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.106406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.106516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.106538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.346 #52 NEW cov: 11763 ft: 13515 corp: 19/539b lim: 35 exec/s: 52 rss: 70Mb L: 25/35 MS: 1 CopyPart- 00:09:59.346 [2024-07-24 13:22:18.167562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.167597] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.167708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.167730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.167834] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.167854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.167964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.167984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.346 [2024-07-24 13:22:18.168085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.346 [2024-07-24 13:22:18.168105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.346 #53 NEW cov: 11763 ft: 13552 corp: 20/574b lim: 35 exec/s: 53 rss: 70Mb L: 35/35 MS: 1 ChangeASCIIInt- 00:09:59.606 [2024-07-24 13:22:18.237625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.237663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.237763] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.237785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.237893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.237915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.238025] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.238047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.606 #54 NEW cov: 11763 ft: 13553 corp: 21/607b lim: 35 exec/s: 54 rss: 70Mb L: 33/35 MS: 1 CrossOver- 00:09:59.606 [2024-07-24 13:22:18.298387] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.298423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.298535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.298557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.298659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 
[2024-07-24 13:22:18.298679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.298790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.298811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.298917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.298937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.606 #55 NEW cov: 11763 ft: 13632 corp: 22/642b lim: 35 exec/s: 55 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:59.606 [2024-07-24 13:22:18.368857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.368891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.369008] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.369030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.369147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.369176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:09:59.606 [2024-07-24 13:22:18.369319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.369349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.369489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.369518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.606 #56 NEW cov: 11763 ft: 13654 corp: 23/677b lim: 35 exec/s: 56 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:09:59.606 [2024-07-24 13:22:18.428560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.428596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.606 [2024-07-24 13:22:18.428698] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.606 [2024-07-24 13:22:18.428722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.606 #57 NEW cov: 11763 ft: 13704 corp: 24/702b lim: 35 exec/s: 57 rss: 70Mb L: 25/35 MS: 1 ShuffleBytes- 00:09:59.866 [2024-07-24 13:22:18.489054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 [2024-07-24 13:22:18.489088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.866 [2024-07-24 13:22:18.489199] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 [2024-07-24 13:22:18.489225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.866 #58 NEW cov: 11763 ft: 13723 corp: 25/727b lim: 35 exec/s: 58 rss: 70Mb L: 25/35 MS: 1 ChangeByte- 00:09:59.866 [2024-07-24 13:22:18.560306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 [2024-07-24 13:22:18.560343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.866 [2024-07-24 13:22:18.560461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 [2024-07-24 13:22:18.560482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.866 [2024-07-24 13:22:18.560594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 [2024-07-24 13:22:18.560616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:59.866 [2024-07-24 13:22:18.560723] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 [2024-07-24 13:22:18.560745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:59.866 [2024-07-24 13:22:18.560856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:59.866 
[2024-07-24 13:22:18.560876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:59.866 #59 NEW cov: 11763 ft: 13725 corp: 26/762b lim: 35 exec/s: 29 rss: 70Mb L: 35/35 MS: 1 InsertByte- 00:09:59.866 #59 DONE cov: 11763 ft: 13725 corp: 26/762b lim: 35 exec/s: 29 rss: 70Mb 00:09:59.866 ###### Recommended dictionary. ###### 00:09:59.866 "T_\012\002\000\000\000\000" # Uses: 0 00:09:59.866 ###### End of recommended dictionary. ###### 00:09:59.866 Done 59 runs in 2 second(s) 00:09:59.866 13:22:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:09:59.866 13:22:18 -- ../common.sh@72 -- # (( i++ )) 00:09:59.866 13:22:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:59.866 13:22:18 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:09:59.866 13:22:18 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:09:59.866 13:22:18 -- nvmf/run.sh@24 -- # local timen=1 00:09:59.866 13:22:18 -- nvmf/run.sh@25 -- # local core=0x1 00:09:59.866 13:22:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:59.866 13:22:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:09:59.866 13:22:18 -- nvmf/run.sh@29 -- # printf %02d 15 00:09:59.866 13:22:18 -- nvmf/run.sh@29 -- # port=4415 00:09:59.866 13:22:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:59.866 13:22:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:09:59.866 13:22:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:00.125 13:22:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:10:00.125 [2024-07-24 13:22:18.764016] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:00.125 [2024-07-24 13:22:18.764086] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172486 ] 00:10:00.125 EAL: No free 2048 kB hugepages reported on node 1 00:10:00.384 [2024-07-24 13:22:19.108799] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:00.384 [2024-07-24 13:22:19.141357] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:00.384 [2024-07-24 13:22:19.141535] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:00.384 [2024-07-24 13:22:19.196259] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:00.384 [2024-07-24 13:22:19.212489] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:10:00.384 INFO: Running with entropic power schedule (0xFF, 100). 00:10:00.384 INFO: Seed: 1174671753 00:10:00.643 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:00.643 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:00.643 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:10:00.643 INFO: A corpus is not provided, starting from an empty corpus 00:10:00.643 #2 INITED exec/s: 0 rss: 61Mb 00:10:00.643 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:00.643 This may also happen if the target rejected all inputs we tried so far 00:10:00.643 [2024-07-24 13:22:19.268711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.643 [2024-07-24 13:22:19.268752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.643 [2024-07-24 13:22:19.268828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.643 [2024-07-24 13:22:19.268849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.643 [2024-07-24 13:22:19.268924] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.643 [2024-07-24 13:22:19.268944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.904 NEW_FUNC[1/671]: 0x4b3c80 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:10:00.904 NEW_FUNC[2/671]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:10:00.904 #4 NEW cov: 11505 ft: 11506 corp: 2/30b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 2 InsertByte-InsertRepeatedBytes- 00:10:00.904 [2024-07-24 13:22:19.739196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.904 [2024-07-24 13:22:19.739250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.207 #14 NEW cov: 11618 ft: 12644 corp: 3/37b lim: 
35 exec/s: 0 rss: 68Mb L: 7/29 MS: 5 ShuffleBytes-CopyPart-InsertByte-CrossOver-CrossOver- 00:10:01.207 [2024-07-24 13:22:19.789886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.789926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.790002] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.790027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.790100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.790121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.790190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.790209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:01.208 #20 NEW cov: 11624 ft: 12995 corp: 4/72b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:10:01.208 [2024-07-24 13:22:19.849890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.849926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.849998] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.850019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.850089] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.850108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.208 #21 NEW cov: 11709 ft: 13334 corp: 5/101b lim: 35 exec/s: 0 rss: 69Mb L: 29/35 MS: 1 ChangeBit- 00:10:01.208 [2024-07-24 13:22:19.899529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000025c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.899563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.208 #22 NEW cov: 11709 ft: 13445 corp: 6/114b lim: 35 exec/s: 0 rss: 69Mb L: 13/35 MS: 1 InsertRepeatedBytes- 00:10:01.208 [2024-07-24 13:22:19.960154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.960189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.960267] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.960289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:19.960362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:19.960380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.208 #23 NEW cov: 11709 ft: 13576 corp: 7/143b lim: 35 exec/s: 0 rss: 69Mb L: 29/35 MS: 1 CMP- DE: "\002\000"- 00:10:01.208 [2024-07-24 13:22:20.020435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:20.020471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:20.020544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:20.020570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.208 [2024-07-24 13:22:20.020643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.208 [2024-07-24 13:22:20.020663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.208 #24 NEW cov: 11709 ft: 13632 corp: 8/173b lim: 35 exec/s: 0 rss: 69Mb L: 30/35 MS: 1 InsertByte- 00:10:01.467 [2024-07-24 13:22:20.080609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000012d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.080647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.080718] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.080738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.080810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.080829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.467 #25 NEW cov: 11709 ft: 13668 corp: 9/203b lim: 35 exec/s: 0 rss: 69Mb L: 30/35 MS: 1 ChangeByte- 00:10:01.467 [2024-07-24 13:22:20.140771] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.140807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.140876] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.140895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.140968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.140987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.467 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:01.467 #27 NEW cov: 11732 ft: 13731 corp: 10/235b lim: 35 exec/s: 0 rss: 69Mb L: 32/35 MS: 2 CrossOver-InsertRepeatedBytes- 00:10:01.467 
[2024-07-24 13:22:20.190888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000134 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.190924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.190996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.191016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.191084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.191103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.467 #28 NEW cov: 11732 ft: 13762 corp: 11/265b lim: 35 exec/s: 0 rss: 69Mb L: 30/35 MS: 1 InsertByte- 00:10:01.467 [2024-07-24 13:22:20.241181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000012d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.241225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.241305] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.241325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.241399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.241418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.241490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.241511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:01.467 #29 NEW cov: 11732 ft: 13790 corp: 12/300b lim: 35 exec/s: 29 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:10:01.467 [2024-07-24 13:22:20.301199] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.301241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.301317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.301339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.467 [2024-07-24 13:22:20.301413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.467 [2024-07-24 13:22:20.301435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.467 #30 NEW cov: 11732 ft: 13814 corp: 13/334b lim: 35 exec/s: 30 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:10:01.726 [2024-07-24 13:22:20.351374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:10:01.726 [2024-07-24 13:22:20.351410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.351480] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.351501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.351572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.351591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.726 #31 NEW cov: 11732 ft: 13848 corp: 14/367b lim: 35 exec/s: 31 rss: 70Mb L: 33/35 MS: 1 InsertByte- 00:10:01.726 [2024-07-24 13:22:20.411414] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.411450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.411520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.411542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.411607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.411631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.411701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.411722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.726 #32 NEW cov: 11732 ft: 13904 corp: 15/399b lim: 35 exec/s: 32 rss: 70Mb L: 32/35 MS: 1 CrossOver- 00:10:01.726 [2024-07-24 13:22:20.461836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.461872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.461948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.461968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.462042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.462063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.462134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.462154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:01.726 #33 NEW cov: 11732 ft: 14032 corp: 16/434b lim: 35 exec/s: 33 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:10:01.726 [2024-07-24 
13:22:20.521515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.521550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.726 [2024-07-24 13:22:20.521622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.726 [2024-07-24 13:22:20.521643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.726 #34 NEW cov: 11732 ft: 14255 corp: 17/449b lim: 35 exec/s: 34 rss: 70Mb L: 15/35 MS: 1 CMP- DE: "\000\000\000\000\000\000\000H"- 00:10:01.726 [2024-07-24 13:22:20.571998] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.727 [2024-07-24 13:22:20.572035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.727 [2024-07-24 13:22:20.572110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.727 [2024-07-24 13:22:20.572131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.727 [2024-07-24 13:22:20.572199] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.727 [2024-07-24 13:22:20.572225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.985 #35 NEW cov: 11732 ft: 14286 corp: 18/482b lim: 35 exec/s: 35 rss: 70Mb L: 33/35 MS: 1 CrossOver- 00:10:01.985 [2024-07-24 
13:22:20.622115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.622151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.622230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.622250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.622323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.622343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.985 #36 NEW cov: 11732 ft: 14306 corp: 19/511b lim: 35 exec/s: 36 rss: 70Mb L: 29/35 MS: 1 EraseBytes- 00:10:01.985 [2024-07-24 13:22:20.682138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.682173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.682251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.682272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.682346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.682365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.985 #37 NEW cov: 11732 ft: 14370 corp: 20/533b lim: 35 exec/s: 37 rss: 70Mb L: 22/35 MS: 1 CopyPart- 00:10:01.985 [2024-07-24 13:22:20.742446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.742480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.742552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.742572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.742642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.742661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.985 [2024-07-24 13:22:20.742728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.985 [2024-07-24 13:22:20.742748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.985 #38 NEW cov: 11732 ft: 14469 corp: 21/565b lim: 35 exec/s: 38 rss: 70Mb L: 32/35 MS: 1 CMP- DE: "fE\314-\240\347-\000"- 00:10:01.985 [2024-07-24 13:22:20.802705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:10:01.985 [2024-07-24 13:22:20.802740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.986 [2024-07-24 13:22:20.802814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.986 [2024-07-24 13:22:20.802834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.986 [2024-07-24 13:22:20.802901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.986 [2024-07-24 13:22:20.802925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.986 #39 NEW cov: 11732 ft: 14486 corp: 22/597b lim: 35 exec/s: 39 rss: 70Mb L: 32/35 MS: 1 ChangeBinInt- 00:10:02.245 [2024-07-24 13:22:20.852819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.852854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.852927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.852947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.853019] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.853038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:10:02.245 #40 NEW cov: 11732 ft: 14492 corp: 23/628b lim: 35 exec/s: 40 rss: 70Mb L: 31/35 MS: 1 EraseBytes- 00:10:02.245 [2024-07-24 13:22:20.912759] ctrlr.c:1772:nvmf_ctrlr_get_features_reservation_persistence: *ERROR*: Get Features - Invalid Namespace ID 00:10:02.245 [2024-07-24 13:22:20.913205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.913246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.913317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.913338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.913406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST RESERVE PERSIST cid:7 cdw10:00000183 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.913427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.913493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.913514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:02.245 NEW_FUNC[1/1]: 0x1163ca0 in nvmf_ctrlr_get_features_reservation_persistence /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1761 00:10:02.245 #41 NEW cov: 11758 ft: 14531 corp: 24/663b lim: 35 exec/s: 41 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:10:02.245 [2024-07-24 13:22:20.983222] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.983257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.983330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.983351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:20.983423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:20.983443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:02.245 #42 NEW cov: 11758 ft: 14544 corp: 25/695b lim: 35 exec/s: 42 rss: 70Mb L: 32/35 MS: 1 ShuffleBytes- 00:10:02.245 [2024-07-24 13:22:21.043375] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000134 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:21.043415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:21.043487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000164 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:21.043508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:21.043580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:10:02.245 [2024-07-24 13:22:21.043598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:02.245 #43 NEW cov: 11758 ft: 14552 corp: 26/725b lim: 35 exec/s: 43 rss: 70Mb L: 30/35 MS: 1 ChangeBit- 00:10:02.245 [2024-07-24 13:22:21.103562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:21.103598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:21.103671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:21.103693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.245 [2024-07-24 13:22:21.103762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.245 [2024-07-24 13:22:21.103782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:02.505 #44 NEW cov: 11758 ft: 14556 corp: 27/758b lim: 35 exec/s: 44 rss: 71Mb L: 33/35 MS: 1 ChangeByte- 00:10:02.505 [2024-07-24 13:22:21.163888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.163923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.505 [2024-07-24 13:22:21.163996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.164018] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.505 [2024-07-24 13:22:21.164085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.164104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:02.505 [2024-07-24 13:22:21.164176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.164197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:02.505 #45 NEW cov: 11758 ft: 14561 corp: 28/793b lim: 35 exec/s: 45 rss: 71Mb L: 35/35 MS: 1 ChangeByte- 00:10:02.505 [2024-07-24 13:22:21.224016] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.224051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:02.505 [2024-07-24 13:22:21.224122] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.224144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:02.505 [2024-07-24 13:22:21.224218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.224238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:02.505 [2024-07-24 
13:22:21.224306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000124 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.505 [2024-07-24 13:22:21.224325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:02.505 #46 NEW cov: 11758 ft: 14572 corp: 29/828b lim: 35 exec/s: 23 rss: 71Mb L: 35/35 MS: 1 CrossOver- 00:10:02.505 #46 DONE cov: 11758 ft: 14572 corp: 29/828b lim: 35 exec/s: 23 rss: 71Mb 00:10:02.505 ###### Recommended dictionary. ###### 00:10:02.505 "\002\000" # Uses: 0 00:10:02.505 "\000\000\000\000\000\000\000H" # Uses: 0 00:10:02.505 "fE\314-\240\347-\000" # Uses: 0 00:10:02.505 ###### End of recommended dictionary. ###### 00:10:02.505 Done 46 runs in 2 second(s) 00:10:02.764 13:22:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:10:02.764 13:22:21 -- ../common.sh@72 -- # (( i++ )) 00:10:02.764 13:22:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:02.764 13:22:21 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:10:02.764 13:22:21 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:10:02.764 13:22:21 -- nvmf/run.sh@24 -- # local timen=1 00:10:02.764 13:22:21 -- nvmf/run.sh@25 -- # local core=0x1 00:10:02.764 13:22:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:10:02.764 13:22:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:10:02.764 13:22:21 -- nvmf/run.sh@29 -- # printf %02d 16 00:10:02.764 13:22:21 -- nvmf/run.sh@29 -- # port=4416 00:10:02.764 13:22:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:10:02.764 13:22:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:10:02.764 13:22:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:02.764 13:22:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:10:02.764 [2024-07-24 13:22:21.422642] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:02.764 [2024-07-24 13:22:21.422740] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172851 ] 00:10:02.764 EAL: No free 2048 kB hugepages reported on node 1 00:10:03.024 [2024-07-24 13:22:21.678816] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:03.024 [2024-07-24 13:22:21.705278] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:03.024 [2024-07-24 13:22:21.705459] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:03.024 [2024-07-24 13:22:21.760009] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:03.024 [2024-07-24 13:22:21.776279] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:10:03.024 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:03.024 INFO: Seed: 3736688160 00:10:03.024 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:03.024 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:03.024 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:10:03.024 INFO: A corpus is not provided, starting from an empty corpus 00:10:03.024 #2 INITED exec/s: 0 rss: 61Mb 00:10:03.024 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:03.024 This may also happen if the target rejected all inputs we tried so far 00:10:03.024 [2024-07-24 13:22:21.853276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.024 [2024-07-24 13:22:21.853329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.542 NEW_FUNC[1/671]: 0x4b5130 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:10:03.542 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:03.542 #4 NEW cov: 11594 ft: 11595 corp: 2/38b lim: 105 exec/s: 0 rss: 68Mb L: 37/37 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:10:03.542 [2024-07-24 13:22:22.323022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.542 [2024-07-24 13:22:22.323087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.542 #5 NEW cov: 11707 ft: 12308 corp: 3/64b lim: 105 exec/s: 0 rss: 68Mb L: 26/37 MS: 1 EraseBytes- 00:10:03.542 [2024-07-24 13:22:22.383001] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.542 [2024-07-24 13:22:22.383039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.801 #11 NEW cov: 11713 ft: 12530 corp: 4/101b lim: 105 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 ChangeBinInt- 00:10:03.801 [2024-07-24 13:22:22.433163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.801 [2024-07-24 13:22:22.433201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.801 #12 NEW cov: 11798 ft: 12771 corp: 5/138b lim: 105 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 ChangeBit- 00:10:03.801 [2024-07-24 13:22:22.483273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.801 [2024-07-24 13:22:22.483311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.801 #13 NEW cov: 11798 ft: 12878 corp: 6/160b lim: 105 exec/s: 0 rss: 69Mb L: 22/37 MS: 1 EraseBytes- 00:10:03.801 [2024-07-24 13:22:22.543446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.801 [2024-07-24 13:22:22.543483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.801 #16 NEW cov: 11798 ft: 12986 corp: 7/181b lim: 105 exec/s: 0 rss: 69Mb L: 21/37 MS: 3 ShuffleBytes-ChangeBinInt-CrossOver- 00:10:03.801 [2024-07-24 13:22:22.593635] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071562067967 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.801 [2024-07-24 13:22:22.593671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.801 #17 NEW cov: 11798 ft: 13079 corp: 8/202b lim: 105 exec/s: 0 rss: 69Mb L: 21/37 MS: 1 ChangeBit- 00:10:03.801 [2024-07-24 13:22:22.653766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.801 [2024-07-24 13:22:22.653802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.060 #18 NEW cov: 11798 ft: 13169 corp: 9/239b lim: 105 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 ChangeByte- 00:10:04.060 [2024-07-24 13:22:22.713925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.060 [2024-07-24 13:22:22.713966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.060 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:04.060 #19 NEW cov: 11821 ft: 13250 corp: 10/260b lim: 105 exec/s: 0 rss: 69Mb L: 21/37 MS: 1 ShuffleBytes- 00:10:04.060 [2024-07-24 13:22:22.764110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.060 [2024-07-24 13:22:22.764146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.060 #20 NEW cov: 11821 ft: 13315 corp: 11/297b lim: 105 exec/s: 0 rss: 69Mb L: 
37/37 MS: 1 ShuffleBytes- 00:10:04.060 [2024-07-24 13:22:22.814282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.060 [2024-07-24 13:22:22.814319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.060 #21 NEW cov: 11821 ft: 13360 corp: 12/318b lim: 105 exec/s: 21 rss: 69Mb L: 21/37 MS: 1 ShuffleBytes- 00:10:04.060 [2024-07-24 13:22:22.874447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:360569445166350335 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.060 [2024-07-24 13:22:22.874484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.060 #22 NEW cov: 11821 ft: 13380 corp: 13/339b lim: 105 exec/s: 22 rss: 69Mb L: 21/37 MS: 1 ChangeBinInt- 00:10:04.060 [2024-07-24 13:22:22.924574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.060 [2024-07-24 13:22:22.924611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.319 #23 NEW cov: 11821 ft: 13390 corp: 14/360b lim: 105 exec/s: 23 rss: 69Mb L: 21/37 MS: 1 ShuffleBytes- 00:10:04.319 [2024-07-24 13:22:22.964656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071562067967 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.319 [2024-07-24 13:22:22.964693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.319 #24 NEW cov: 11821 ft: 13407 corp: 15/395b lim: 105 exec/s: 24 rss: 69Mb L: 35/37 MS: 1 InsertRepeatedBytes- 
00:10:04.319 [2024-07-24 13:22:23.024829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.319 [2024-07-24 13:22:23.024866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.319 #25 NEW cov: 11821 ft: 13431 corp: 16/417b lim: 105 exec/s: 25 rss: 69Mb L: 22/37 MS: 1 InsertByte- 00:10:04.319 [2024-07-24 13:22:23.065019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.319 [2024-07-24 13:22:23.065056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.319 #26 NEW cov: 11821 ft: 13480 corp: 17/450b lim: 105 exec/s: 26 rss: 69Mb L: 33/37 MS: 1 CopyPart- 00:10:04.319 [2024-07-24 13:22:23.125164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:360569445166350335 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.319 [2024-07-24 13:22:23.125203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.319 #27 NEW cov: 11821 ft: 13508 corp: 18/491b lim: 105 exec/s: 27 rss: 69Mb L: 41/41 MS: 1 CrossOver- 00:10:04.578 [2024-07-24 13:22:23.185645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446596163625811967 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.185687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.578 [2024-07-24 13:22:23.185742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.185767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:04.578 [2024-07-24 13:22:23.185842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.185865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:04.578 #28 NEW cov: 11821 ft: 14009 corp: 19/565b lim: 105 exec/s: 28 rss: 70Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:10:04.578 [2024-07-24 13:22:23.245478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073558556671 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.245515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.578 #29 NEW cov: 11821 ft: 14014 corp: 20/586b lim: 105 exec/s: 29 rss: 70Mb L: 21/74 MS: 1 ChangeByte- 00:10:04.578 [2024-07-24 13:22:23.295880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446596163625811967 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.295917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.578 [2024-07-24 13:22:23.295970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8753160913407277433 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.295991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:04.578 [2024-07-24 13:22:23.296057] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:8753160913407277433 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.296080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:04.578 #30 NEW cov: 11821 ft: 14028 corp: 21/660b lim: 105 exec/s: 30 rss: 70Mb L: 74/74 MS: 1 ChangeBit- 00:10:04.578 [2024-07-24 13:22:23.356102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7306357456645743973 len:15718 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.356139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.578 [2024-07-24 13:22:23.356188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14685055086129564619 len:52172 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.356216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:04.578 [2024-07-24 13:22:23.356281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:14685055086129564619 len:52172 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.356303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:04.578 #33 NEW cov: 11821 ft: 14073 corp: 22/723b lim: 105 exec/s: 33 rss: 70Mb L: 63/74 MS: 3 InsertRepeatedBytes-InsertByte-InsertRepeatedBytes- 00:10:04.578 [2024-07-24 13:22:23.405923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073558556671 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.578 [2024-07-24 13:22:23.405963] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.837 #34 NEW cov: 11821 ft: 14090 corp: 23/750b lim: 105 exec/s: 34 rss: 70Mb L: 27/74 MS: 1 CrossOver- 00:10:04.837 [2024-07-24 13:22:23.466071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.837 [2024-07-24 13:22:23.466108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.837 #35 NEW cov: 11821 ft: 14101 corp: 24/775b lim: 105 exec/s: 35 rss: 70Mb L: 25/74 MS: 1 CMP- DE: "\001\000\000\000"- 00:10:04.837 [2024-07-24 13:22:23.506196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743936270598143 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.837 [2024-07-24 13:22:23.506238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.837 #36 NEW cov: 11821 ft: 14108 corp: 25/796b lim: 105 exec/s: 36 rss: 70Mb L: 21/74 MS: 1 ChangeBit- 00:10:04.837 [2024-07-24 13:22:23.566374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070035341311 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.837 [2024-07-24 13:22:23.566410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.837 #37 NEW cov: 11821 ft: 14116 corp: 26/818b lim: 105 exec/s: 37 rss: 70Mb L: 22/74 MS: 1 InsertByte- 00:10:04.837 [2024-07-24 13:22:23.626539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071562067967 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.837 [2024-07-24 13:22:23.626575] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.837 #38 NEW cov: 11821 ft: 14133 corp: 27/840b lim: 105 exec/s: 38 rss: 70Mb L: 22/74 MS: 1 InsertByte- 00:10:04.837 [2024-07-24 13:22:23.666640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:04.837 [2024-07-24 13:22:23.666677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.095 #39 NEW cov: 11821 ft: 14139 corp: 28/866b lim: 105 exec/s: 39 rss: 70Mb L: 26/74 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:10:05.095 [2024-07-24 13:22:23.726862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18437736874454810623 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:05.096 [2024-07-24 13:22:23.726899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.096 #40 NEW cov: 11821 ft: 14140 corp: 29/887b lim: 105 exec/s: 40 rss: 70Mb L: 21/74 MS: 1 ShuffleBytes- 00:10:05.096 [2024-07-24 13:22:23.777389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10489325059699610001 len:37266 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:05.096 [2024-07-24 13:22:23.777426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.096 [2024-07-24 13:22:23.777487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10489325061521117585 len:37266 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:05.096 [2024-07-24 13:22:23.777508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.096 [2024-07-24 13:22:23.777575] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:10489325061521117585 len:37266 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:05.096 [2024-07-24 13:22:23.777596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.096 [2024-07-24 13:22:23.777665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:10489325061521117585 len:37266 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:05.096 [2024-07-24 13:22:23.777688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:05.096 #42 NEW cov: 11821 ft: 14619 corp: 30/974b lim: 105 exec/s: 42 rss: 70Mb L: 87/87 MS: 2 EraseBytes-InsertRepeatedBytes- 00:10:05.096 [2024-07-24 13:22:23.837114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071562067967 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:05.096 [2024-07-24 13:22:23.837153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.096 #43 NEW cov: 11821 ft: 14627 corp: 31/1009b lim: 105 exec/s: 21 rss: 70Mb L: 35/87 MS: 1 ChangeBit- 00:10:05.096 #43 DONE cov: 11821 ft: 14627 corp: 31/1009b lim: 105 exec/s: 21 rss: 70Mb 00:10:05.096 ###### Recommended dictionary. ###### 00:10:05.096 "\001\000\000\000" # Uses: 1 00:10:05.096 ###### End of recommended dictionary. 
###### 00:10:05.096 Done 43 runs in 2 second(s) 00:10:05.355 13:22:24 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:10:05.355 13:22:24 -- ../common.sh@72 -- # (( i++ )) 00:10:05.355 13:22:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:05.355 13:22:24 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:10:05.355 13:22:24 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:10:05.355 13:22:24 -- nvmf/run.sh@24 -- # local timen=1 00:10:05.355 13:22:24 -- nvmf/run.sh@25 -- # local core=0x1 00:10:05.355 13:22:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:10:05.355 13:22:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:10:05.355 13:22:24 -- nvmf/run.sh@29 -- # printf %02d 17 00:10:05.355 13:22:24 -- nvmf/run.sh@29 -- # port=4417 00:10:05.355 13:22:24 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:10:05.355 13:22:24 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:10:05.355 13:22:24 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:05.355 13:22:24 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:10:05.355 [2024-07-24 13:22:24.052929] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:10:05.355 [2024-07-24 13:22:24.053020] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173221 ] 00:10:05.355 EAL: No free 2048 kB hugepages reported on node 1 00:10:05.615 [2024-07-24 13:22:24.292997] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.615 [2024-07-24 13:22:24.319183] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:05.615 [2024-07-24 13:22:24.319366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.615 [2024-07-24 13:22:24.373958] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:05.615 [2024-07-24 13:22:24.390197] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:10:05.615 INFO: Running with entropic power schedule (0xFF, 100). 00:10:05.615 INFO: Seed: 2057712268 00:10:05.615 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:05.615 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:05.615 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:10:05.615 INFO: A corpus is not provided, starting from an empty corpus 00:10:05.615 #2 INITED exec/s: 0 rss: 61Mb 00:10:05.615 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:05.615 This may also happen if the target rejected all inputs we tried so far 00:10:05.615 [2024-07-24 13:22:24.445788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.615 [2024-07-24 13:22:24.445829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.615 [2024-07-24 13:22:24.445894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.615 [2024-07-24 13:22:24.445916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.133 NEW_FUNC[1/672]: 0x4b8420 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:10:06.133 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:06.133 #3 NEW cov: 11615 ft: 11616 corp: 2/53b lim: 120 exec/s: 0 rss: 68Mb L: 52/52 MS: 1 InsertRepeatedBytes- 00:10:06.133 [2024-07-24 13:22:24.776455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.776504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.133 #4 NEW cov: 11728 ft: 13002 corp: 3/86b lim: 120 exec/s: 0 rss: 68Mb L: 33/52 MS: 1 InsertRepeatedBytes- 00:10:06.133 [2024-07-24 13:22:24.836878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.836919] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.133 [2024-07-24 13:22:24.836965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.836988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.133 [2024-07-24 13:22:24.837055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.837079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.133 #5 NEW cov: 11734 ft: 13540 corp: 4/166b lim: 120 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:10:06.133 [2024-07-24 13:22:24.897106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:176553984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.897145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.133 [2024-07-24 13:22:24.897192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.897218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.133 [2024-07-24 13:22:24.897284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.897306] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.133 #6 NEW cov: 11819 ft: 13826 corp: 5/246b lim: 120 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 ChangeByte- 00:10:06.133 [2024-07-24 13:22:24.957102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.957145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.133 [2024-07-24 13:22:24.957218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477350204430552 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.133 [2024-07-24 13:22:24.957242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.133 #7 NEW cov: 11819 ft: 13924 corp: 6/298b lim: 120 exec/s: 0 rss: 69Mb L: 52/80 MS: 1 ChangeBinInt- 00:10:06.392 [2024-07-24 13:22:25.017181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.017230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.392 [2024-07-24 13:22:25.017281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477350204430552 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.017305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.392 #8 NEW cov: 11819 ft: 14022 corp: 7/351b lim: 120 exec/s: 0 rss: 69Mb L: 53/80 MS: 1 InsertByte- 00:10:06.392 [2024-07-24 13:22:25.077461] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.077501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.392 [2024-07-24 13:22:25.077549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15913707709176273112 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.077573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.392 #9 NEW cov: 11819 ft: 14114 corp: 8/399b lim: 120 exec/s: 0 rss: 69Mb L: 48/80 MS: 1 CrossOver- 00:10:06.392 [2024-07-24 13:22:25.127610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.127648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.392 [2024-07-24 13:22:25.127707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477350204430552 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.127730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.392 #10 NEW cov: 11819 ft: 14153 corp: 9/451b lim: 120 exec/s: 0 rss: 69Mb L: 52/80 MS: 1 ChangeBinInt- 00:10:06.392 [2024-07-24 13:22:25.178075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.392 [2024-07-24 13:22:25.178113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.393 [2024-07-24 13:22:25.178161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.178186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.393 [2024-07-24 13:22:25.178250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.178273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.393 [2024-07-24 13:22:25.178338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.178368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:06.393 #11 NEW cov: 11819 ft: 14525 corp: 10/549b lim: 120 exec/s: 0 rss: 69Mb L: 98/98 MS: 1 CopyPart- 00:10:06.393 [2024-07-24 13:22:25.228206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.228255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.393 [2024-07-24 13:22:25.228306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.228328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.393 [2024-07-24 13:22:25.228392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.228414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.393 [2024-07-24 13:22:25.228479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.393 [2024-07-24 13:22:25.228502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:06.652 #12 NEW cov: 11819 ft: 14568 corp: 11/647b lim: 120 exec/s: 0 rss: 69Mb L: 98/98 MS: 1 ChangeBinInt- 00:10:06.652 [2024-07-24 13:22:25.288198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:176553984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.288244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.652 [2024-07-24 13:22:25.288296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.288317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.652 [2024-07-24 13:22:25.288383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.288406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:10:06.652 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:06.652 #18 NEW cov: 11842 ft: 14591 corp: 12/727b lim: 120 exec/s: 0 rss: 69Mb L: 80/98 MS: 1 ChangeBit- 00:10:06.652 [2024-07-24 13:22:25.348089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:176553984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.348127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.652 #19 NEW cov: 11842 ft: 14618 corp: 13/774b lim: 120 exec/s: 0 rss: 69Mb L: 47/98 MS: 1 EraseBytes- 00:10:06.652 [2024-07-24 13:22:25.398389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:176553984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.398428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.652 [2024-07-24 13:22:25.398492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18374686483966590975 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.398519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.652 #20 NEW cov: 11842 ft: 14622 corp: 14/836b lim: 120 exec/s: 20 rss: 70Mb L: 62/98 MS: 1 CopyPart- 00:10:06.652 [2024-07-24 13:22:25.458500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.458538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.652 [2024-07-24 13:22:25.458583] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477350204430552 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.458606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.652 #21 NEW cov: 11842 ft: 14645 corp: 15/888b lim: 120 exec/s: 21 rss: 70Mb L: 52/98 MS: 1 ShuffleBytes- 00:10:06.652 [2024-07-24 13:22:25.498643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568401112 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.498681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.652 [2024-07-24 13:22:25.498740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.652 [2024-07-24 13:22:25.498763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.911 #22 NEW cov: 11842 ft: 14669 corp: 16/941b lim: 120 exec/s: 22 rss: 70Mb L: 53/98 MS: 1 InsertByte- 00:10:06.911 [2024-07-24 13:22:25.538792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.911 [2024-07-24 13:22:25.538829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.911 [2024-07-24 13:22:25.538871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625448745722239192 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.911 [2024-07-24 13:22:25.538894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.911 #23 NEW cov: 11842 ft: 14706 corp: 17/994b lim: 120 exec/s: 23 rss: 70Mb L: 53/98 MS: 1 InsertByte- 00:10:06.911 [2024-07-24 13:22:25.598736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568401112 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.911 [2024-07-24 13:22:25.598774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.911 #24 NEW cov: 11842 ft: 14722 corp: 18/1026b lim: 120 exec/s: 24 rss: 70Mb L: 32/98 MS: 1 EraseBytes- 00:10:06.912 [2024-07-24 13:22:25.659472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.659511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.659568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.659592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.659656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.659677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.659748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 
13:22:25.659771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:06.912 #25 NEW cov: 11842 ft: 14774 corp: 19/1124b lim: 120 exec/s: 25 rss: 70Mb L: 98/98 MS: 1 ShuffleBytes- 00:10:06.912 [2024-07-24 13:22:25.709681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.709719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.709766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.709790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.709856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.709877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.709943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.709965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:06.912 #26 NEW cov: 11842 ft: 14792 corp: 20/1222b lim: 120 exec/s: 26 rss: 70Mb L: 98/98 MS: 1 CopyPart- 00:10:06.912 [2024-07-24 13:22:25.769837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 
nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.769874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.769932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.769956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.770021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.770042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.912 [2024-07-24 13:22:25.770106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.912 [2024-07-24 13:22:25.770129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:07.171 #27 NEW cov: 11842 ft: 14811 corp: 21/1320b lim: 120 exec/s: 27 rss: 70Mb L: 98/98 MS: 1 CopyPart- 00:10:07.171 [2024-07-24 13:22:25.829463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568401112 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.829503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.171 #28 NEW cov: 11842 ft: 14823 corp: 22/1352b lim: 120 exec/s: 28 rss: 70Mb L: 32/98 MS: 1 ShuffleBytes- 00:10:07.171 
[2024-07-24 13:22:25.890167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.890217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.171 [2024-07-24 13:22:25.890265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15913707709176273112 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.890289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.171 [2024-07-24 13:22:25.890354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.890378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.171 [2024-07-24 13:22:25.890443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4412750543122677053 len:15678 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.890464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:07.171 #29 NEW cov: 11842 ft: 14852 corp: 23/1448b lim: 120 exec/s: 29 rss: 70Mb L: 96/98 MS: 1 InsertRepeatedBytes- 00:10:07.171 [2024-07-24 13:22:25.949776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:56537 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.949814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.171 #30 NEW 
cov: 11842 ft: 14866 corp: 24/1479b lim: 120 exec/s: 30 rss: 70Mb L: 31/98 MS: 1 EraseBytes- 00:10:07.171 [2024-07-24 13:22:25.999917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568401112 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.171 [2024-07-24 13:22:25.999956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.430 #31 NEW cov: 11842 ft: 14959 corp: 25/1511b lim: 120 exec/s: 31 rss: 70Mb L: 32/98 MS: 1 ShuffleBytes- 00:10:07.430 [2024-07-24 13:22:26.060697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.060736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.060791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.060812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.060879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.060902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.060965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15626603232931403992 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.060988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:07.430 #32 NEW cov: 11842 ft: 14961 corp: 26/1609b lim: 120 exec/s: 32 rss: 70Mb L: 98/98 MS: 1 ShuffleBytes- 00:10:07.430 [2024-07-24 13:22:26.110419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.110458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.110522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.110546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.430 #33 NEW cov: 11842 ft: 14962 corp: 27/1669b lim: 120 exec/s: 33 rss: 71Mb L: 60/98 MS: 1 CMP- DE: "\001-\347\243\014\036\352b"- 00:10:07.430 [2024-07-24 13:22:26.170990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.171028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.171083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.171104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.171167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15625477333024823512 len:55513 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.171192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.171257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:15625477333024561368 len:55513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.171280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:07.430 #34 NEW cov: 11842 ft: 15002 corp: 28/1776b lim: 120 exec/s: 34 rss: 71Mb L: 107/107 MS: 1 InsertRepeatedBytes- 00:10:07.430 [2024-07-24 13:22:26.220588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15625477329568454872 len:56322 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.220625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.430 #35 NEW cov: 11842 ft: 15101 corp: 29/1815b lim: 120 exec/s: 35 rss: 71Mb L: 39/107 MS: 1 PersAutoDict- DE: "\001-\347\243\014\036\352b"- 00:10:07.430 [2024-07-24 13:22:26.281114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.281153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 13:22:26.281199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.281227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.430 [2024-07-24 
13:22:26.281295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.430 [2024-07-24 13:22:26.281318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.690 #36 NEW cov: 11842 ft: 15108 corp: 30/1895b lim: 120 exec/s: 36 rss: 71Mb L: 80/107 MS: 1 CrossOver- 00:10:07.690 [2024-07-24 13:22:26.330855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:101318656 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.690 [2024-07-24 13:22:26.330892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.690 #39 NEW cov: 11842 ft: 15154 corp: 31/1929b lim: 120 exec/s: 39 rss: 71Mb L: 34/107 MS: 3 ChangeBit-ChangeBinInt-CrossOver- 00:10:07.690 [2024-07-24 13:22:26.381038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.690 [2024-07-24 13:22:26.381077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.690 #41 NEW cov: 11842 ft: 15166 corp: 32/1962b lim: 120 exec/s: 41 rss: 71Mb L: 33/107 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:10:07.690 [2024-07-24 13:22:26.431204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:176553984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:07.690 [2024-07-24 13:22:26.431246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.690 #42 NEW cov: 11842 ft: 15180 corp: 33/2009b lim: 120 exec/s: 21 rss: 71Mb L: 47/107 MS: 1 CMP- DE: "\001-\347\2437yn\202"- 00:10:07.690 #42 DONE cov: 11842 ft: 15180 corp: 
33/2009b lim: 120 exec/s: 21 rss: 71Mb 00:10:07.690 ###### Recommended dictionary. ###### 00:10:07.690 "\001-\347\243\014\036\352b" # Uses: 1 00:10:07.690 "\001-\347\2437yn\202" # Uses: 0 00:10:07.690 ###### End of recommended dictionary. ###### 00:10:07.690 Done 42 runs in 2 second(s) 00:10:07.950 13:22:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:10:07.950 13:22:26 -- ../common.sh@72 -- # (( i++ )) 00:10:07.950 13:22:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:07.950 13:22:26 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:10:07.950 13:22:26 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:10:07.950 13:22:26 -- nvmf/run.sh@24 -- # local timen=1 00:10:07.950 13:22:26 -- nvmf/run.sh@25 -- # local core=0x1 00:10:07.950 13:22:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:10:07.950 13:22:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:10:07.950 13:22:26 -- nvmf/run.sh@29 -- # printf %02d 18 00:10:07.950 13:22:26 -- nvmf/run.sh@29 -- # port=4418 00:10:07.950 13:22:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:10:07.950 13:22:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:10:07.950 13:22:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:07.950 13:22:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:10:07.950 
[2024-07-24 13:22:26.644961] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:07.950 [2024-07-24 13:22:26.645046] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173579 ] 00:10:07.950 EAL: No free 2048 kB hugepages reported on node 1 00:10:08.210 [2024-07-24 13:22:26.885530] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.210 [2024-07-24 13:22:26.911659] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:08.210 [2024-07-24 13:22:26.911830] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.210 [2024-07-24 13:22:26.966347] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:08.210 [2024-07-24 13:22:26.982576] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:10:08.210 INFO: Running with entropic power schedule (0xFF, 100). 00:10:08.210 INFO: Seed: 352736025 00:10:08.210 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:08.210 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:08.210 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:10:08.210 INFO: A corpus is not provided, starting from an empty corpus 00:10:08.210 #2 INITED exec/s: 0 rss: 61Mb 00:10:08.210 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:08.210 This may also happen if the target rejected all inputs we tried so far 00:10:08.210 [2024-07-24 13:22:27.031669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.210 [2024-07-24 13:22:27.031710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.778 NEW_FUNC[1/670]: 0x4bbc80 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:10:08.778 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:08.778 #27 NEW cov: 11551 ft: 11560 corp: 2/27b lim: 100 exec/s: 0 rss: 68Mb L: 26/26 MS: 5 InsertRepeatedBytes-ChangeByte-ChangeBit-ShuffleBytes-CopyPart- 00:10:08.778 [2024-07-24 13:22:27.503284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.778 [2024-07-24 13:22:27.503334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.778 [2024-07-24 13:22:27.503385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.778 [2024-07-24 13:22:27.503406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.778 [2024-07-24 13:22:27.503467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.778 [2024-07-24 13:22:27.503488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.778 [2024-07-24 13:22:27.503549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:08.778 [2024-07-24 
13:22:27.503568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:08.778 #43 NEW cov: 11672 ft: 12419 corp: 3/115b lim: 100 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:10:08.779 [2024-07-24 13:22:27.553223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.779 [2024-07-24 13:22:27.553259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.779 [2024-07-24 13:22:27.553317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.779 [2024-07-24 13:22:27.553338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.779 [2024-07-24 13:22:27.553399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.779 [2024-07-24 13:22:27.553419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.779 [2024-07-24 13:22:27.553480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:08.779 [2024-07-24 13:22:27.553501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:08.779 #44 NEW cov: 11678 ft: 12622 corp: 4/204b lim: 100 exec/s: 0 rss: 68Mb L: 89/89 MS: 1 InsertByte- 00:10:08.779 [2024-07-24 13:22:27.613473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.779 [2024-07-24 13:22:27.613509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.779 [2024-07-24 13:22:27.613560] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.779 [2024-07-24 13:22:27.613581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.779 [2024-07-24 13:22:27.613643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.779 [2024-07-24 13:22:27.613667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.779 [2024-07-24 13:22:27.613729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:08.779 [2024-07-24 13:22:27.613750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.039 #45 NEW cov: 11763 ft: 12929 corp: 5/293b lim: 100 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 ChangeByte- 00:10:09.039 [2024-07-24 13:22:27.673617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.039 [2024-07-24 13:22:27.673652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.673710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.039 [2024-07-24 13:22:27.673732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.673794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.039 [2024-07-24 13:22:27.673814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.673876] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.039 [2024-07-24 13:22:27.673897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.039 #46 NEW cov: 11763 ft: 13013 corp: 6/381b lim: 100 exec/s: 0 rss: 69Mb L: 88/89 MS: 1 ShuffleBytes- 00:10:09.039 [2024-07-24 13:22:27.723525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.039 [2024-07-24 13:22:27.723561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.723621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.039 [2024-07-24 13:22:27.723642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.039 #47 NEW cov: 11763 ft: 13412 corp: 7/429b lim: 100 exec/s: 0 rss: 69Mb L: 48/89 MS: 1 EraseBytes- 00:10:09.039 [2024-07-24 13:22:27.773610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.039 [2024-07-24 13:22:27.773646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.773687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.039 [2024-07-24 13:22:27.773708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.039 #48 NEW cov: 11763 ft: 13503 corp: 8/477b lim: 100 exec/s: 0 rss: 69Mb L: 48/89 MS: 1 ChangeBinInt- 00:10:09.039 [2024-07-24 13:22:27.834068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 
nsid:0 00:10:09.039 [2024-07-24 13:22:27.834104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.834163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.039 [2024-07-24 13:22:27.834184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.834250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.039 [2024-07-24 13:22:27.834271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.834335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.039 [2024-07-24 13:22:27.834360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.039 #49 NEW cov: 11763 ft: 13534 corp: 9/573b lim: 100 exec/s: 0 rss: 69Mb L: 96/96 MS: 1 CMP- DE: "\377\377\377\377\377\377\003\000"- 00:10:09.039 [2024-07-24 13:22:27.894233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.039 [2024-07-24 13:22:27.894269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.894330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.039 [2024-07-24 13:22:27.894351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.894412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.039 [2024-07-24 13:22:27.894433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.039 [2024-07-24 13:22:27.894494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.039 [2024-07-24 13:22:27.894514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.298 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:09.298 #50 NEW cov: 11786 ft: 13602 corp: 10/661b lim: 100 exec/s: 0 rss: 69Mb L: 88/96 MS: 1 ShuffleBytes- 00:10:09.298 [2024-07-24 13:22:27.944395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.298 [2024-07-24 13:22:27.944430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.298 [2024-07-24 13:22:27.944491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.298 [2024-07-24 13:22:27.944512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.298 [2024-07-24 13:22:27.944574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.298 [2024-07-24 13:22:27.944594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.298 [2024-07-24 13:22:27.944655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.298 [2024-07-24 13:22:27.944677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.298 #51 NEW cov: 11786 ft: 13632 corp: 11/750b lim: 100 exec/s: 0 rss: 69Mb L: 89/96 MS: 1 InsertByte- 00:10:09.298 [2024-07-24 13:22:27.994558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.298 [2024-07-24 13:22:27.994593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.298 [2024-07-24 13:22:27.994641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.298 [2024-07-24 13:22:27.994662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.298 [2024-07-24 13:22:27.994722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.299 [2024-07-24 13:22:27.994743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:27.994804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.299 [2024-07-24 13:22:27.994824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.299 #52 NEW cov: 11786 ft: 13651 corp: 12/839b lim: 100 exec/s: 52 rss: 69Mb L: 89/96 MS: 1 ChangeBit- 00:10:09.299 [2024-07-24 13:22:28.044665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.299 [2024-07-24 13:22:28.044701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:28.044759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.299 [2024-07-24 
13:22:28.044780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:28.044840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.299 [2024-07-24 13:22:28.044861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:28.044922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.299 [2024-07-24 13:22:28.044943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.299 #53 NEW cov: 11786 ft: 13700 corp: 13/928b lim: 100 exec/s: 53 rss: 69Mb L: 89/96 MS: 1 ChangeBit- 00:10:09.299 [2024-07-24 13:22:28.104859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.299 [2024-07-24 13:22:28.104895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:28.104941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.299 [2024-07-24 13:22:28.104961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:28.105021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.299 [2024-07-24 13:22:28.105042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.299 [2024-07-24 13:22:28.105105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.299 [2024-07-24 
13:22:28.105125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.299 #54 NEW cov: 11786 ft: 13712 corp: 14/1017b lim: 100 exec/s: 54 rss: 69Mb L: 89/96 MS: 1 CopyPart- 00:10:09.559 [2024-07-24 13:22:28.164787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.559 [2024-07-24 13:22:28.164823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.164868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.559 [2024-07-24 13:22:28.164890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.559 #55 NEW cov: 11786 ft: 13752 corp: 15/1066b lim: 100 exec/s: 55 rss: 69Mb L: 49/96 MS: 1 EraseBytes- 00:10:09.559 [2024-07-24 13:22:28.225240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.559 [2024-07-24 13:22:28.225276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.225335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.559 [2024-07-24 13:22:28.225357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.225418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.559 [2024-07-24 13:22:28.225438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.225504] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.559 [2024-07-24 13:22:28.225525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.559 #56 NEW cov: 11786 ft: 13772 corp: 16/1162b lim: 100 exec/s: 56 rss: 70Mb L: 96/96 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:10:09.559 [2024-07-24 13:22:28.285422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.559 [2024-07-24 13:22:28.285458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.285506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.559 [2024-07-24 13:22:28.285527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.285587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.559 [2024-07-24 13:22:28.285609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.285672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.559 [2024-07-24 13:22:28.285693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.559 #57 NEW cov: 11786 ft: 13802 corp: 17/1250b lim: 100 exec/s: 57 rss: 70Mb L: 88/96 MS: 1 ChangeByte- 00:10:09.559 [2024-07-24 13:22:28.335495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.559 [2024-07-24 13:22:28.335530] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.335579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.559 [2024-07-24 13:22:28.335599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.335660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.559 [2024-07-24 13:22:28.335681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.335744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.559 [2024-07-24 13:22:28.335763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.559 #58 NEW cov: 11786 ft: 13815 corp: 18/1339b lim: 100 exec/s: 58 rss: 70Mb L: 89/96 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:10:09.559 [2024-07-24 13:22:28.385709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.559 [2024-07-24 13:22:28.385744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.385803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.559 [2024-07-24 13:22:28.385824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.385887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.559 
[2024-07-24 13:22:28.385908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.559 [2024-07-24 13:22:28.385970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.559 [2024-07-24 13:22:28.385991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.819 #59 NEW cov: 11786 ft: 13826 corp: 19/1432b lim: 100 exec/s: 59 rss: 70Mb L: 93/96 MS: 1 CopyPart- 00:10:09.819 [2024-07-24 13:22:28.445856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.819 [2024-07-24 13:22:28.445892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.445950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.819 [2024-07-24 13:22:28.445971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.446032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.819 [2024-07-24 13:22:28.446053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.446115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.819 [2024-07-24 13:22:28.446135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.819 #60 NEW cov: 11786 ft: 13912 corp: 20/1525b lim: 100 exec/s: 60 rss: 70Mb L: 93/96 MS: 1 ChangeByte- 00:10:09.819 [2024-07-24 
13:22:28.506026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.819 [2024-07-24 13:22:28.506062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.506120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.819 [2024-07-24 13:22:28.506142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.506206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.819 [2024-07-24 13:22:28.506234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.506297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.819 [2024-07-24 13:22:28.506318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.819 #61 NEW cov: 11786 ft: 13916 corp: 21/1613b lim: 100 exec/s: 61 rss: 70Mb L: 88/96 MS: 1 CrossOver- 00:10:09.819 [2024-07-24 13:22:28.546314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.819 [2024-07-24 13:22:28.546349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.546412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.819 [2024-07-24 13:22:28.546434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 
13:22:28.546496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.819 [2024-07-24 13:22:28.546515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.546579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.819 [2024-07-24 13:22:28.546599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.546661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:10:09.819 [2024-07-24 13:22:28.546682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:09.819 #62 NEW cov: 11786 ft: 14017 corp: 22/1713b lim: 100 exec/s: 62 rss: 70Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:10:09.819 [2024-07-24 13:22:28.606364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.819 [2024-07-24 13:22:28.606400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.606459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.819 [2024-07-24 13:22:28.606480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.606542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.819 [2024-07-24 13:22:28.606564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.819 
[2024-07-24 13:22:28.606625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.819 [2024-07-24 13:22:28.606645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:09.819 #63 NEW cov: 11786 ft: 14029 corp: 23/1802b lim: 100 exec/s: 63 rss: 70Mb L: 89/100 MS: 1 ChangeByte- 00:10:09.819 [2024-07-24 13:22:28.656434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.819 [2024-07-24 13:22:28.656470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.656512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.819 [2024-07-24 13:22:28.656532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.656591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.819 [2024-07-24 13:22:28.656612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.819 [2024-07-24 13:22:28.656673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:09.819 [2024-07-24 13:22:28.656694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:10.078 #64 NEW cov: 11786 ft: 14036 corp: 24/1894b lim: 100 exec/s: 64 rss: 70Mb L: 92/100 MS: 1 CrossOver- 00:10:10.079 [2024-07-24 13:22:28.716692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.079 [2024-07-24 13:22:28.716728] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.716774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.079 [2024-07-24 13:22:28.716796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.716856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:10.079 [2024-07-24 13:22:28.716875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.716939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:10.079 [2024-07-24 13:22:28.716961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:10.079 #65 NEW cov: 11786 ft: 14047 corp: 25/1977b lim: 100 exec/s: 65 rss: 70Mb L: 83/100 MS: 1 EraseBytes- 00:10:10.079 [2024-07-24 13:22:28.766852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.079 [2024-07-24 13:22:28.766891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.766935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.079 [2024-07-24 13:22:28.766956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.767017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:10.079 [2024-07-24 13:22:28.767038] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.767103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:10.079 [2024-07-24 13:22:28.767123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:10.079 #66 NEW cov: 11786 ft: 14058 corp: 26/2070b lim: 100 exec/s: 66 rss: 70Mb L: 93/100 MS: 1 ChangeBinInt- 00:10:10.079 [2024-07-24 13:22:28.816930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.079 [2024-07-24 13:22:28.816965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.817017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.079 [2024-07-24 13:22:28.817038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.817102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:10.079 [2024-07-24 13:22:28.817123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.817183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:10.079 [2024-07-24 13:22:28.817204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:10.079 #67 NEW cov: 11786 ft: 14071 corp: 27/2159b lim: 100 exec/s: 67 rss: 70Mb L: 89/100 MS: 1 CopyPart- 00:10:10.079 [2024-07-24 13:22:28.866803] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.079 [2024-07-24 13:22:28.866839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.866889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.079 [2024-07-24 13:22:28.866910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.079 #68 NEW cov: 11786 ft: 14092 corp: 28/2208b lim: 100 exec/s: 68 rss: 70Mb L: 49/100 MS: 1 InsertByte- 00:10:10.079 [2024-07-24 13:22:28.927130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.079 [2024-07-24 13:22:28.927166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.927221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.079 [2024-07-24 13:22:28.927242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.079 [2024-07-24 13:22:28.927302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:10.079 [2024-07-24 13:22:28.927323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:10.338 #74 NEW cov: 11786 ft: 14323 corp: 29/2281b lim: 100 exec/s: 74 rss: 70Mb L: 73/100 MS: 1 EraseBytes- 00:10:10.338 [2024-07-24 13:22:28.987404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.338 [2024-07-24 13:22:28.987443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.338 [2024-07-24 13:22:28.987490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.338 [2024-07-24 13:22:28.987511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.338 [2024-07-24 13:22:28.987574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:10.338 [2024-07-24 13:22:28.987595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:10.338 [2024-07-24 13:22:28.987657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:10.338 [2024-07-24 13:22:28.987678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:10.338 #75 NEW cov: 11786 ft: 14339 corp: 30/2377b lim: 100 exec/s: 75 rss: 70Mb L: 96/100 MS: 1 ChangeBinInt- 00:10:10.338 [2024-07-24 13:22:29.037619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:10.338 [2024-07-24 13:22:29.037654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.338 [2024-07-24 13:22:29.037701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:10.338 [2024-07-24 13:22:29.037722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.338 [2024-07-24 13:22:29.037785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:10.338 [2024-07-24 13:22:29.037805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:10.338 [2024-07-24 13:22:29.037871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:10.338 [2024-07-24 13:22:29.037891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:10.339 #76 NEW cov: 11786 ft: 14348 corp: 31/2474b lim: 100 exec/s: 38 rss: 70Mb L: 97/100 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:10:10.339 #76 DONE cov: 11786 ft: 14348 corp: 31/2474b lim: 100 exec/s: 38 rss: 70Mb 00:10:10.339 ###### Recommended dictionary. ###### 00:10:10.339 "\377\377\377\377\377\377\003\000" # Uses: 3 00:10:10.339 ###### End of recommended dictionary. ###### 00:10:10.339 Done 76 runs in 2 second(s) 00:10:10.598 13:22:29 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:10:10.598 13:22:29 -- ../common.sh@72 -- # (( i++ )) 00:10:10.598 13:22:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:10.598 13:22:29 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:10:10.598 13:22:29 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:10:10.598 13:22:29 -- nvmf/run.sh@24 -- # local timen=1 00:10:10.598 13:22:29 -- nvmf/run.sh@25 -- # local core=0x1 00:10:10.598 13:22:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:10:10.598 13:22:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:10:10.598 13:22:29 -- nvmf/run.sh@29 -- # printf %02d 19 00:10:10.598 13:22:29 -- nvmf/run.sh@29 -- # port=4419 00:10:10.598 13:22:29 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:10:10.598 13:22:29 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:10:10.598 13:22:29 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:10.598 13:22:29 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:10:10.598 [2024-07-24 13:22:29.260250] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:10.598 [2024-07-24 13:22:29.260325] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173945 ] 00:10:10.598 EAL: No free 2048 kB hugepages reported on node 1 00:10:10.857 [2024-07-24 13:22:29.517861] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:10.857 [2024-07-24 13:22:29.544038] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:10.857 [2024-07-24 13:22:29.544216] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:10.857 [2024-07-24 13:22:29.598718] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:10.857 [2024-07-24 13:22:29.614942] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:10:10.857 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:10.857 INFO: Seed: 2986731928 00:10:10.857 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:10.857 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:10.857 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:10:10.857 INFO: A corpus is not provided, starting from an empty corpus 00:10:10.857 #2 INITED exec/s: 0 rss: 61Mb 00:10:10.857 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:10.857 This may also happen if the target rejected all inputs we tried so far 00:10:10.857 [2024-07-24 13:22:29.670090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:10.857 [2024-07-24 13:22:29.670135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.857 [2024-07-24 13:22:29.670184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:10.857 [2024-07-24 13:22:29.670230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.857 [2024-07-24 13:22:29.670278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:10:10.857 [2024-07-24 13:22:29.670303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:11.374 NEW_FUNC[1/670]: 0x4bec40 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:10:11.374 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:11.375 #10 NEW cov: 11537 ft: 
11538 corp: 2/34b lim: 50 exec/s: 0 rss: 68Mb L: 33/33 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:10:11.375 [2024-07-24 13:22:30.030932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:11.375 [2024-07-24 13:22:30.030996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.375 [2024-07-24 13:22:30.031049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:11.375 [2024-07-24 13:22:30.031076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.375 #11 NEW cov: 11650 ft: 12303 corp: 3/55b lim: 50 exec/s: 0 rss: 68Mb L: 21/33 MS: 1 EraseBytes- 00:10:11.375 [2024-07-24 13:22:30.131149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17726168135983824896 len:1 00:10:11.375 [2024-07-24 13:22:30.131199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.375 [2024-07-24 13:22:30.131264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:11.375 [2024-07-24 13:22:30.131291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.375 [2024-07-24 13:22:30.131336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:10:11.375 [2024-07-24 13:22:30.131361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:11.375 #12 NEW cov: 11656 ft: 12568 corp: 4/88b lim: 50 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 
00:10:11.375 [2024-07-24 13:22:30.211224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:11.375 [2024-07-24 13:22:30.211269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.375 [2024-07-24 13:22:30.211319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:11.375 [2024-07-24 13:22:30.211348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.633 #13 NEW cov: 11741 ft: 12759 corp: 5/117b lim: 50 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 EraseBytes- 00:10:11.633 [2024-07-24 13:22:30.281509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17726168135983824939 len:1 00:10:11.633 [2024-07-24 13:22:30.281552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.633 [2024-07-24 13:22:30.281601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:11.633 [2024-07-24 13:22:30.281630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.633 [2024-07-24 13:22:30.281675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:10:11.633 [2024-07-24 13:22:30.281701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:11.633 #14 NEW cov: 11741 ft: 12831 corp: 6/150b lim: 50 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeByte- 00:10:11.633 [2024-07-24 13:22:30.371649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:11.633 [2024-07-24 13:22:30.371693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.633 [2024-07-24 13:22:30.371745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:159 00:10:11.633 [2024-07-24 13:22:30.371775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.633 #15 NEW cov: 11741 ft: 12941 corp: 7/171b lim: 50 exec/s: 0 rss: 69Mb L: 21/33 MS: 1 CrossOver- 00:10:11.633 [2024-07-24 13:22:30.461953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:281477630263296 len:1 00:10:11.633 [2024-07-24 13:22:30.461997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.633 [2024-07-24 13:22:30.462046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:11.633 [2024-07-24 13:22:30.462074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.892 #16 NEW cov: 11741 ft: 12981 corp: 8/192b lim: 50 exec/s: 0 rss: 69Mb L: 21/33 MS: 1 ChangeBit- 00:10:11.892 [2024-07-24 13:22:30.562233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:2 00:10:11.892 [2024-07-24 13:22:30.562276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.892 [2024-07-24 13:22:30.562330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:11.892 [2024-07-24 13:22:30.562358] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.892 [2024-07-24 13:22:30.562404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:10:11.892 [2024-07-24 13:22:30.562430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:11.892 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:11.892 #17 NEW cov: 11758 ft: 13089 corp: 9/225b lim: 50 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 ChangeBinInt- 00:10:11.892 [2024-07-24 13:22:30.632394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:11.892 [2024-07-24 13:22:30.632436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.892 [2024-07-24 13:22:30.632486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4398046511104 len:1 00:10:11.892 [2024-07-24 13:22:30.632515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.892 #18 NEW cov: 11758 ft: 13124 corp: 10/246b lim: 50 exec/s: 18 rss: 69Mb L: 21/33 MS: 1 ChangeBit- 00:10:11.892 [2024-07-24 13:22:30.702554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:11.892 [2024-07-24 13:22:30.702597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.892 [2024-07-24 13:22:30.702647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:140737488355328 len:159 00:10:11.892 
[2024-07-24 13:22:30.702675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.150 #19 NEW cov: 11758 ft: 13141 corp: 11/267b lim: 50 exec/s: 19 rss: 69Mb L: 21/33 MS: 1 ChangeBit- 00:10:12.150 [2024-07-24 13:22:30.792830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:19833421824 len:1 00:10:12.150 [2024-07-24 13:22:30.792872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.150 [2024-07-24 13:22:30.792921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:12.150 [2024-07-24 13:22:30.792949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.150 #20 NEW cov: 11758 ft: 13164 corp: 12/296b lim: 50 exec/s: 20 rss: 69Mb L: 29/33 MS: 1 ChangeBit- 00:10:12.150 [2024-07-24 13:22:30.863004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374686482339201023 len:1 00:10:12.150 [2024-07-24 13:22:30.863049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.150 [2024-07-24 13:22:30.863100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:159 00:10:12.150 [2024-07-24 13:22:30.863128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.150 #21 NEW cov: 11758 ft: 13188 corp: 13/317b lim: 50 exec/s: 21 rss: 69Mb L: 21/33 MS: 1 CMP- DE: "\377\377\377\377"- 00:10:12.150 [2024-07-24 13:22:30.933154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:2653552640 len:1 00:10:12.150 [2024-07-24 13:22:30.933196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.150 [2024-07-24 13:22:30.933261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:256 00:10:12.150 [2024-07-24 13:22:30.933291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.150 #22 NEW cov: 11758 ft: 13195 corp: 14/338b lim: 50 exec/s: 22 rss: 69Mb L: 21/33 MS: 1 ChangeBinInt- 00:10:12.150 [2024-07-24 13:22:31.003418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17726168135983824896 len:1 00:10:12.150 [2024-07-24 13:22:31.003460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.150 [2024-07-24 13:22:31.003510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:12.150 [2024-07-24 13:22:31.003537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.409 #23 NEW cov: 11758 ft: 13223 corp: 15/364b lim: 50 exec/s: 23 rss: 69Mb L: 26/33 MS: 1 EraseBytes- 00:10:12.409 [2024-07-24 13:22:31.073576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:4097 00:10:12.409 [2024-07-24 13:22:31.073619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.409 [2024-07-24 13:22:31.073668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:256 00:10:12.409 [2024-07-24 13:22:31.073695] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.409 #24 NEW cov: 11758 ft: 13243 corp: 16/385b lim: 50 exec/s: 24 rss: 69Mb L: 21/33 MS: 1 ChangeBit- 00:10:12.409 [2024-07-24 13:22:31.163852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:12.409 [2024-07-24 13:22:31.163895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.409 [2024-07-24 13:22:31.163945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:140741783322623 len:159 00:10:12.409 [2024-07-24 13:22:31.163973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.409 #25 NEW cov: 11758 ft: 13247 corp: 17/406b lim: 50 exec/s: 25 rss: 69Mb L: 21/33 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:10:12.409 [2024-07-24 13:22:31.254099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:12.409 [2024-07-24 13:22:31.254142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.409 [2024-07-24 13:22:31.254192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967295 len:1 00:10:12.409 [2024-07-24 13:22:31.254231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.668 #26 NEW cov: 11758 ft: 13302 corp: 18/431b lim: 50 exec/s: 26 rss: 69Mb L: 25/33 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:10:12.668 [2024-07-24 13:22:31.324256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:2653592576 len:17 00:10:12.668 [2024-07-24 13:22:31.324301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.668 [2024-07-24 13:22:31.324350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:12.668 [2024-07-24 13:22:31.324379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.668 #27 NEW cov: 11758 ft: 13313 corp: 19/453b lim: 50 exec/s: 27 rss: 69Mb L: 22/33 MS: 1 InsertByte- 00:10:12.668 [2024-07-24 13:22:31.414471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:681258385408 len:10753 00:10:12.668 [2024-07-24 13:22:31.414516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.668 #28 NEW cov: 11758 ft: 13624 corp: 20/471b lim: 50 exec/s: 28 rss: 69Mb L: 18/33 MS: 1 CrossOver- 00:10:12.668 [2024-07-24 13:22:31.484610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:12.668 [2024-07-24 13:22:31.484654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.927 #29 NEW cov: 11758 ft: 13636 corp: 21/488b lim: 50 exec/s: 29 rss: 69Mb L: 17/33 MS: 1 EraseBytes- 00:10:12.927 [2024-07-24 13:22:31.554920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:2 00:10:12.927 [2024-07-24 13:22:31.554964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.927 [2024-07-24 13:22:31.555013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:10:12.927 [2024-07-24 13:22:31.555041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.927 [2024-07-24 13:22:31.555087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:10:12.927 [2024-07-24 13:22:31.555112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:12.927 #30 NEW cov: 11765 ft: 13668 corp: 22/521b lim: 50 exec/s: 30 rss: 69Mb L: 33/33 MS: 1 ShuffleBytes- 00:10:12.927 [2024-07-24 13:22:31.645076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2653552640 len:1 00:10:12.927 [2024-07-24 13:22:31.645120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.927 #31 NEW cov: 11765 ft: 13704 corp: 23/538b lim: 50 exec/s: 15 rss: 69Mb L: 17/33 MS: 1 EraseBytes- 00:10:12.927 #31 DONE cov: 11765 ft: 13704 corp: 23/538b lim: 50 exec/s: 15 rss: 69Mb 00:10:12.927 ###### Recommended dictionary. ###### 00:10:12.927 "\377\377\377\377" # Uses: 2 00:10:12.927 ###### End of recommended dictionary. 
###### 00:10:12.927 Done 31 runs in 2 second(s) 00:10:13.187 13:22:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:10:13.187 13:22:31 -- ../common.sh@72 -- # (( i++ )) 00:10:13.187 13:22:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:13.187 13:22:31 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:10:13.187 13:22:31 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:10:13.187 13:22:31 -- nvmf/run.sh@24 -- # local timen=1 00:10:13.187 13:22:31 -- nvmf/run.sh@25 -- # local core=0x1 00:10:13.187 13:22:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:10:13.187 13:22:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:10:13.187 13:22:31 -- nvmf/run.sh@29 -- # printf %02d 20 00:10:13.187 13:22:31 -- nvmf/run.sh@29 -- # port=4420 00:10:13.187 13:22:31 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:10:13.187 13:22:31 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:10:13.187 13:22:31 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:13.187 13:22:31 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:10:13.187 [2024-07-24 13:22:31.904246] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:10:13.187 [2024-07-24 13:22:31.904328] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174309 ] 00:10:13.187 EAL: No free 2048 kB hugepages reported on node 1 00:10:13.446 [2024-07-24 13:22:32.158026] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:13.446 [2024-07-24 13:22:32.184125] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:13.446 [2024-07-24 13:22:32.184307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:13.446 [2024-07-24 13:22:32.238824] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:13.446 [2024-07-24 13:22:32.255069] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:10:13.446 INFO: Running with entropic power schedule (0xFF, 100). 00:10:13.446 INFO: Seed: 1331775977 00:10:13.446 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:13.446 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:13.446 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:10:13.446 INFO: A corpus is not provided, starting from an empty corpus 00:10:13.446 #2 INITED exec/s: 0 rss: 61Mb 00:10:13.446 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:13.446 This may also happen if the target rejected all inputs we tried so far 00:10:13.446 [2024-07-24 13:22:32.310824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.446 [2024-07-24 13:22:32.310864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.446 [2024-07-24 13:22:32.310923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.446 [2024-07-24 13:22:32.310946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.446 [2024-07-24 13:22:32.311013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:13.446 [2024-07-24 13:22:32.311035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:13.963 NEW_FUNC[1/671]: 0x4c0800 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:10:13.963 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:13.963 #16 NEW cov: 11593 ft: 11594 corp: 2/60b lim: 90 exec/s: 0 rss: 68Mb L: 59/59 MS: 4 CopyPart-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:10:13.963 [2024-07-24 13:22:32.782174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.963 [2024-07-24 13:22:32.782232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.963 [2024-07-24 13:22:32.782294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 
00:10:13.963 [2024-07-24 13:22:32.782315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.963 [2024-07-24 13:22:32.782378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:13.963 [2024-07-24 13:22:32.782400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:13.963 [2024-07-24 13:22:32.782465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:13.963 [2024-07-24 13:22:32.782486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:14.222 NEW_FUNC[1/1]: 0x15406f0 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:10:14.222 #17 NEW cov: 11708 ft: 12471 corp: 3/137b lim: 90 exec/s: 0 rss: 69Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:10:14.222 [2024-07-24 13:22:32.852095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.223 [2024-07-24 13:22:32.852139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:32.852206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.223 [2024-07-24 13:22:32.852235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:32.852300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.223 [2024-07-24 13:22:32.852323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.223 #18 NEW cov: 11714 ft: 12683 corp: 4/196b lim: 90 exec/s: 0 rss: 69Mb L: 59/77 MS: 1 ChangeBinInt- 00:10:14.223 [2024-07-24 13:22:32.902246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.223 [2024-07-24 13:22:32.902284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:32.902330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.223 [2024-07-24 13:22:32.902353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:32.902417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.223 [2024-07-24 13:22:32.902440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.223 #19 NEW cov: 11799 ft: 12927 corp: 5/264b lim: 90 exec/s: 0 rss: 69Mb L: 68/77 MS: 1 InsertRepeatedBytes- 00:10:14.223 [2024-07-24 13:22:32.952129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.223 [2024-07-24 13:22:32.952166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.223 #20 NEW cov: 11799 ft: 13923 corp: 6/297b lim: 90 exec/s: 0 rss: 69Mb L: 33/77 MS: 1 EraseBytes- 00:10:14.223 [2024-07-24 13:22:33.012624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.223 [2024-07-24 13:22:33.012661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:10:14.223 [2024-07-24 13:22:33.012708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.223 [2024-07-24 13:22:33.012730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:33.012796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.223 [2024-07-24 13:22:33.012819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.223 #21 NEW cov: 11799 ft: 13987 corp: 7/362b lim: 90 exec/s: 0 rss: 69Mb L: 65/77 MS: 1 CopyPart- 00:10:14.223 [2024-07-24 13:22:33.062917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.223 [2024-07-24 13:22:33.062954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:33.063006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.223 [2024-07-24 13:22:33.063026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:33.063096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.223 [2024-07-24 13:22:33.063120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.223 [2024-07-24 13:22:33.063184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:14.223 [2024-07-24 13:22:33.063205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:14.517 #22 NEW cov: 11799 ft: 14056 corp: 8/440b lim: 90 exec/s: 0 rss: 69Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:10:14.517 [2024-07-24 13:22:33.122878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.517 [2024-07-24 13:22:33.122916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.122961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.517 [2024-07-24 13:22:33.122984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.123048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.517 [2024-07-24 13:22:33.123070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.517 #23 NEW cov: 11799 ft: 14061 corp: 9/499b lim: 90 exec/s: 0 rss: 69Mb L: 59/78 MS: 1 ChangeByte- 00:10:14.517 [2024-07-24 13:22:33.163198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.517 [2024-07-24 13:22:33.163243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.163298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.517 [2024-07-24 13:22:33.163321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.163386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) 
sqid:1 cid:2 nsid:0 00:10:14.517 [2024-07-24 13:22:33.163409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.163474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:14.517 [2024-07-24 13:22:33.163500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:14.517 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:14.517 #24 NEW cov: 11822 ft: 14082 corp: 10/576b lim: 90 exec/s: 0 rss: 69Mb L: 77/78 MS: 1 ShuffleBytes- 00:10:14.517 [2024-07-24 13:22:33.223180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.517 [2024-07-24 13:22:33.223223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.223272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.517 [2024-07-24 13:22:33.223293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.223358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.517 [2024-07-24 13:22:33.223381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.517 #25 NEW cov: 11822 ft: 14134 corp: 11/635b lim: 90 exec/s: 0 rss: 69Mb L: 59/78 MS: 1 ChangeByte- 00:10:14.517 [2024-07-24 13:22:33.283032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.517 
[2024-07-24 13:22:33.283068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.517 #26 NEW cov: 11822 ft: 14168 corp: 12/668b lim: 90 exec/s: 26 rss: 69Mb L: 33/78 MS: 1 ShuffleBytes- 00:10:14.517 [2024-07-24 13:22:33.343706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.517 [2024-07-24 13:22:33.343742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.343798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.517 [2024-07-24 13:22:33.343819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.343882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.517 [2024-07-24 13:22:33.343904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.517 [2024-07-24 13:22:33.343970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:14.517 [2024-07-24 13:22:33.343992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:14.776 #27 NEW cov: 11822 ft: 14178 corp: 13/743b lim: 90 exec/s: 27 rss: 70Mb L: 75/78 MS: 1 InsertRepeatedBytes- 00:10:14.776 [2024-07-24 13:22:33.403719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.776 [2024-07-24 13:22:33.403755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:10:14.776 [2024-07-24 13:22:33.403802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.776 [2024-07-24 13:22:33.403824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.776 [2024-07-24 13:22:33.403887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.776 [2024-07-24 13:22:33.403909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.776 #28 NEW cov: 11822 ft: 14193 corp: 14/797b lim: 90 exec/s: 28 rss: 70Mb L: 54/78 MS: 1 EraseBytes- 00:10:14.776 [2024-07-24 13:22:33.463874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.776 [2024-07-24 13:22:33.463910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.776 [2024-07-24 13:22:33.463956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.776 [2024-07-24 13:22:33.463979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.776 [2024-07-24 13:22:33.464043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.776 [2024-07-24 13:22:33.464066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.776 #29 NEW cov: 11822 ft: 14276 corp: 15/856b lim: 90 exec/s: 29 rss: 70Mb L: 59/78 MS: 1 CrossOver- 00:10:14.776 [2024-07-24 13:22:33.514156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.776 
[2024-07-24 13:22:33.514193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.514245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.777 [2024-07-24 13:22:33.514273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.514335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.777 [2024-07-24 13:22:33.514356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.514419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:14.777 [2024-07-24 13:22:33.514442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:14.777 #30 NEW cov: 11822 ft: 14309 corp: 16/931b lim: 90 exec/s: 30 rss: 70Mb L: 75/78 MS: 1 CrossOver- 00:10:14.777 [2024-07-24 13:22:33.574190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.777 [2024-07-24 13:22:33.574232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.574278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.777 [2024-07-24 13:22:33.574300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.574362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE 
(11) sqid:1 cid:2 nsid:0 00:10:14.777 [2024-07-24 13:22:33.574385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.777 #31 NEW cov: 11822 ft: 14362 corp: 17/991b lim: 90 exec/s: 31 rss: 70Mb L: 60/78 MS: 1 InsertByte- 00:10:14.777 [2024-07-24 13:22:33.634362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.777 [2024-07-24 13:22:33.634399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.634445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.777 [2024-07-24 13:22:33.634467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.777 [2024-07-24 13:22:33.634533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.777 [2024-07-24 13:22:33.634556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.035 #32 NEW cov: 11822 ft: 14375 corp: 18/1055b lim: 90 exec/s: 32 rss: 70Mb L: 64/78 MS: 1 CrossOver- 00:10:15.035 [2024-07-24 13:22:33.684670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.035 [2024-07-24 13:22:33.684708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.684753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.035 [2024-07-24 13:22:33.684775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.684839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.035 [2024-07-24 13:22:33.684861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.684927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.035 [2024-07-24 13:22:33.684949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.035 #33 NEW cov: 11822 ft: 14391 corp: 19/1132b lim: 90 exec/s: 33 rss: 70Mb L: 77/78 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\001"- 00:10:15.035 [2024-07-24 13:22:33.734741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.035 [2024-07-24 13:22:33.734778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.734835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.035 [2024-07-24 13:22:33.734856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.734919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.035 [2024-07-24 13:22:33.734941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.735004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.035 [2024-07-24 13:22:33.735025] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.035 #34 NEW cov: 11822 ft: 14395 corp: 20/1207b lim: 90 exec/s: 34 rss: 70Mb L: 75/78 MS: 1 ChangeByte- 00:10:15.035 [2024-07-24 13:22:33.794794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.035 [2024-07-24 13:22:33.794832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.794877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.035 [2024-07-24 13:22:33.794899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.794963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.035 [2024-07-24 13:22:33.794984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.035 #35 NEW cov: 11822 ft: 14423 corp: 21/1266b lim: 90 exec/s: 35 rss: 70Mb L: 59/78 MS: 1 CopyPart- 00:10:15.035 [2024-07-24 13:22:33.845126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.035 [2024-07-24 13:22:33.845162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.845209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.035 [2024-07-24 13:22:33.845236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.845300] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.035 [2024-07-24 13:22:33.845323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.035 [2024-07-24 13:22:33.845379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.035 [2024-07-24 13:22:33.845401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.035 #36 NEW cov: 11822 ft: 14441 corp: 22/1352b lim: 90 exec/s: 36 rss: 70Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:10:15.294 [2024-07-24 13:22:33.905326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.294 [2024-07-24 13:22:33.905364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:33.905417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.294 [2024-07-24 13:22:33.905438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:33.905506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.294 [2024-07-24 13:22:33.905529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:33.905587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.294 [2024-07-24 13:22:33.905608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:10:15.294 #37 NEW cov: 11822 ft: 14460 corp: 23/1428b lim: 90 exec/s: 37 rss: 70Mb L: 76/86 MS: 1 InsertRepeatedBytes- 00:10:15.294 [2024-07-24 13:22:33.965178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.294 [2024-07-24 13:22:33.965220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:33.965279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.294 [2024-07-24 13:22:33.965301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.294 #38 NEW cov: 11822 ft: 14791 corp: 24/1474b lim: 90 exec/s: 38 rss: 70Mb L: 46/86 MS: 1 EraseBytes- 00:10:15.294 [2024-07-24 13:22:34.025835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.294 [2024-07-24 13:22:34.025873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:34.025927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.294 [2024-07-24 13:22:34.025949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:34.026011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.294 [2024-07-24 13:22:34.026033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:34.026100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 
00:10:15.294 [2024-07-24 13:22:34.026123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.294 #39 NEW cov: 11822 ft: 14895 corp: 25/1557b lim: 90 exec/s: 39 rss: 70Mb L: 83/86 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:10:15.294 [2024-07-24 13:22:34.075625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.294 [2024-07-24 13:22:34.075662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.294 [2024-07-24 13:22:34.075707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.294 [2024-07-24 13:22:34.075730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.295 [2024-07-24 13:22:34.075793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.295 [2024-07-24 13:22:34.075814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.295 #40 NEW cov: 11822 ft: 14910 corp: 26/1621b lim: 90 exec/s: 40 rss: 70Mb L: 64/86 MS: 1 ChangeBinInt- 00:10:15.295 [2024-07-24 13:22:34.135932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.295 [2024-07-24 13:22:34.135970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.295 [2024-07-24 13:22:34.136021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.295 [2024-07-24 13:22:34.136044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.295 [2024-07-24 13:22:34.136110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.295 [2024-07-24 13:22:34.136132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.295 [2024-07-24 13:22:34.136197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.295 [2024-07-24 13:22:34.136226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.554 #41 NEW cov: 11822 ft: 14933 corp: 27/1698b lim: 90 exec/s: 41 rss: 70Mb L: 77/86 MS: 1 ChangeBit- 00:10:15.554 [2024-07-24 13:22:34.186096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.554 [2024-07-24 13:22:34.186133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.554 [2024-07-24 13:22:34.186187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.554 [2024-07-24 13:22:34.186218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.554 [2024-07-24 13:22:34.186281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.554 [2024-07-24 13:22:34.186303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.554 [2024-07-24 13:22:34.186366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.554 [2024-07-24 13:22:34.186389] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.554 #42 NEW cov: 11822 ft: 14958 corp: 28/1777b lim: 90 exec/s: 42 rss: 70Mb L: 79/86 MS: 1 InsertRepeatedBytes- 00:10:15.554 [2024-07-24 13:22:34.236260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.554 [2024-07-24 13:22:34.236299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.555 [2024-07-24 13:22:34.236353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.555 [2024-07-24 13:22:34.236374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.555 [2024-07-24 13:22:34.236439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.555 [2024-07-24 13:22:34.236461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.555 [2024-07-24 13:22:34.236522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:15.555 [2024-07-24 13:22:34.236545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.555 #43 NEW cov: 11822 ft: 14962 corp: 29/1852b lim: 90 exec/s: 43 rss: 70Mb L: 75/86 MS: 1 ShuffleBytes- 00:10:15.555 [2024-07-24 13:22:34.276200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:15.555 [2024-07-24 13:22:34.276245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.555 [2024-07-24 
13:22:34.276288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:15.555 [2024-07-24 13:22:34.276310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.555 [2024-07-24 13:22:34.276379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:15.555 [2024-07-24 13:22:34.276402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.555 #44 NEW cov: 11822 ft: 14964 corp: 30/1906b lim: 90 exec/s: 22 rss: 70Mb L: 54/86 MS: 1 CrossOver- 00:10:15.555 #44 DONE cov: 11822 ft: 14964 corp: 30/1906b lim: 90 exec/s: 22 rss: 70Mb 00:10:15.555 ###### Recommended dictionary. ###### 00:10:15.555 "\000\000\000\000\000\000\000\001" # Uses: 1 00:10:15.555 ###### End of recommended dictionary. ###### 00:10:15.555 Done 44 runs in 2 second(s) 00:10:15.815 13:22:34 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:10:15.815 13:22:34 -- ../common.sh@72 -- # (( i++ )) 00:10:15.815 13:22:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:15.815 13:22:34 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:10:15.815 13:22:34 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:10:15.815 13:22:34 -- nvmf/run.sh@24 -- # local timen=1 00:10:15.815 13:22:34 -- nvmf/run.sh@25 -- # local core=0x1 00:10:15.815 13:22:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:10:15.815 13:22:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:10:15.815 13:22:34 -- nvmf/run.sh@29 -- # printf %02d 21 00:10:15.815 13:22:34 -- nvmf/run.sh@29 -- # port=4421 00:10:15.815 13:22:34 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:10:15.815 13:22:34 -- nvmf/run.sh@32 -- # 
trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:10:15.815 13:22:34 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:15.815 13:22:34 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:10:15.815 [2024-07-24 13:22:34.497146] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:15.815 [2024-07-24 13:22:34.497232] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174679 ] 00:10:15.815 EAL: No free 2048 kB hugepages reported on node 1 00:10:16.074 [2024-07-24 13:22:34.748559] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:16.074 [2024-07-24 13:22:34.774716] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:16.074 [2024-07-24 13:22:34.774888] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:16.074 [2024-07-24 13:22:34.829528] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:16.074 [2024-07-24 13:22:34.845775] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:10:16.074 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:16.074 INFO: Seed: 3922767573 00:10:16.074 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:16.074 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:16.074 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:10:16.074 INFO: A corpus is not provided, starting from an empty corpus 00:10:16.074 #2 INITED exec/s: 0 rss: 61Mb 00:10:16.074 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:16.074 This may also happen if the target rejected all inputs we tried so far 00:10:16.074 [2024-07-24 13:22:34.901005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.074 [2024-07-24 13:22:34.901052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.074 [2024-07-24 13:22:34.901105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.074 [2024-07-24 13:22:34.901139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.592 NEW_FUNC[1/669]: 0x4c3a20 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:10:16.592 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:16.592 #5 NEW cov: 11551 ft: 11531 corp: 2/27b lim: 50 exec/s: 0 rss: 68Mb L: 26/26 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:10:16.592 [2024-07-24 13:22:35.261690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.592 [2024-07-24 13:22:35.261746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.592 NEW_FUNC[1/3]: 0xf81f50 in posix_sock_recv /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1597 00:10:16.592 NEW_FUNC[2/3]: 0xf822b0 in posix_sock_readv /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1547 00:10:16.592 #6 NEW cov: 11683 ft: 12684 corp: 3/43b lim: 50 exec/s: 0 rss: 69Mb L: 16/26 MS: 1 EraseBytes- 00:10:16.592 [2024-07-24 13:22:35.361900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.592 [2024-07-24 13:22:35.361948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.592 [2024-07-24 13:22:35.362001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.592 [2024-07-24 13:22:35.362029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.592 #12 NEW cov: 11689 ft: 13034 corp: 4/67b lim: 50 exec/s: 0 rss: 69Mb L: 24/26 MS: 1 CopyPart- 00:10:16.592 [2024-07-24 13:22:35.452339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.592 [2024-07-24 13:22:35.452389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.592 [2024-07-24 13:22:35.452442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.592 [2024-07-24 13:22:35.452470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.592 [2024-07-24 13:22:35.452516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 
nsid:0 00:10:16.592 [2024-07-24 13:22:35.452541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.592 [2024-07-24 13:22:35.452586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.592 [2024-07-24 13:22:35.452611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.851 #16 NEW cov: 11774 ft: 13647 corp: 5/107b lim: 50 exec/s: 0 rss: 69Mb L: 40/40 MS: 4 ShuffleBytes-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:10:16.852 [2024-07-24 13:22:35.532372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.852 [2024-07-24 13:22:35.532416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.852 [2024-07-24 13:22:35.532468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.852 [2024-07-24 13:22:35.532496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.852 #17 NEW cov: 11774 ft: 13749 corp: 6/130b lim: 50 exec/s: 0 rss: 69Mb L: 23/40 MS: 1 InsertRepeatedBytes- 00:10:16.852 [2024-07-24 13:22:35.602406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.852 [2024-07-24 13:22:35.602455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.852 #18 NEW cov: 11774 ft: 13854 corp: 7/146b lim: 50 exec/s: 0 rss: 69Mb L: 16/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\017"- 00:10:16.852 [2024-07-24 13:22:35.682815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.852 [2024-07-24 13:22:35.682858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.852 [2024-07-24 13:22:35.682910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.852 [2024-07-24 13:22:35.682939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.852 [2024-07-24 13:22:35.682985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.852 [2024-07-24 13:22:35.683011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.111 #20 NEW cov: 11774 ft: 14143 corp: 8/183b lim: 50 exec/s: 0 rss: 69Mb L: 37/40 MS: 2 ChangeBit-InsertRepeatedBytes- 00:10:17.111 [2024-07-24 13:22:35.753148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.111 [2024-07-24 13:22:35.753190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.753248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.111 [2024-07-24 13:22:35.753276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.753325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.111 [2024-07-24 13:22:35.753350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.753395] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:17.111 [2024-07-24 13:22:35.753421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:17.111 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:17.111 #21 NEW cov: 11791 ft: 14199 corp: 9/228b lim: 50 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:10:17.111 [2024-07-24 13:22:35.853278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.111 [2024-07-24 13:22:35.853322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.853373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.111 [2024-07-24 13:22:35.853402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.853452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.111 [2024-07-24 13:22:35.853478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.111 #22 NEW cov: 11791 ft: 14268 corp: 10/265b lim: 50 exec/s: 22 rss: 69Mb L: 37/45 MS: 1 ChangeBit- 00:10:17.111 [2024-07-24 13:22:35.953601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.111 [2024-07-24 13:22:35.953644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.953700] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.111 [2024-07-24 13:22:35.953729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.111 [2024-07-24 13:22:35.953777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.111 [2024-07-24 13:22:35.953803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.369 #23 NEW cov: 11791 ft: 14340 corp: 11/302b lim: 50 exec/s: 23 rss: 69Mb L: 37/45 MS: 1 ShuffleBytes- 00:10:17.369 [2024-07-24 13:22:36.053949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.369 [2024-07-24 13:22:36.053993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.369 [2024-07-24 13:22:36.054044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.369 [2024-07-24 13:22:36.054071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.369 [2024-07-24 13:22:36.054119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.369 [2024-07-24 13:22:36.054145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.369 [2024-07-24 13:22:36.054190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:17.369 [2024-07-24 13:22:36.054225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:17.369 #24 NEW cov: 11791 ft: 14420 corp: 
12/344b lim: 50 exec/s: 24 rss: 69Mb L: 42/45 MS: 1 CopyPart- 00:10:17.369 [2024-07-24 13:22:36.153952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.369 [2024-07-24 13:22:36.153996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.369 #25 NEW cov: 11791 ft: 14441 corp: 13/360b lim: 50 exec/s: 25 rss: 69Mb L: 16/45 MS: 1 ChangeByte- 00:10:17.628 [2024-07-24 13:22:36.254514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.628 [2024-07-24 13:22:36.254559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.628 [2024-07-24 13:22:36.254611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.628 [2024-07-24 13:22:36.254639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.628 [2024-07-24 13:22:36.254686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.628 [2024-07-24 13:22:36.254712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.628 [2024-07-24 13:22:36.254757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:17.628 [2024-07-24 13:22:36.254781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:17.628 #26 NEW cov: 11791 ft: 14471 corp: 14/409b lim: 50 exec/s: 26 rss: 70Mb L: 49/49 MS: 1 CopyPart- 00:10:17.628 [2024-07-24 13:22:36.354589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.628 [2024-07-24 13:22:36.354632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.628 [2024-07-24 13:22:36.354684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.628 [2024-07-24 13:22:36.354712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.628 #27 NEW cov: 11791 ft: 14480 corp: 15/432b lim: 50 exec/s: 27 rss: 70Mb L: 23/49 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\017"- 00:10:17.628 [2024-07-24 13:22:36.444940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.628 [2024-07-24 13:22:36.444984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.628 [2024-07-24 13:22:36.445037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.628 [2024-07-24 13:22:36.445065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.628 [2024-07-24 13:22:36.445112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.628 [2024-07-24 13:22:36.445138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.887 #28 NEW cov: 11791 ft: 14514 corp: 16/469b lim: 50 exec/s: 28 rss: 70Mb L: 37/49 MS: 1 CrossOver- 00:10:17.887 [2024-07-24 13:22:36.514989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.887 [2024-07-24 13:22:36.515034] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.887 [2024-07-24 13:22:36.515086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.887 [2024-07-24 13:22:36.515114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.887 #29 NEW cov: 11791 ft: 14516 corp: 17/490b lim: 50 exec/s: 29 rss: 70Mb L: 21/49 MS: 1 EraseBytes- 00:10:17.887 [2024-07-24 13:22:36.605180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.887 [2024-07-24 13:22:36.605232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.887 #30 NEW cov: 11791 ft: 14539 corp: 18/506b lim: 50 exec/s: 30 rss: 70Mb L: 16/49 MS: 1 ChangeBit- 00:10:17.887 [2024-07-24 13:22:36.705745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:17.887 [2024-07-24 13:22:36.705788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.887 [2024-07-24 13:22:36.705839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:17.887 [2024-07-24 13:22:36.705867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:17.887 [2024-07-24 13:22:36.705913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:17.887 [2024-07-24 13:22:36.705939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:17.887 [2024-07-24 13:22:36.705984] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:17.887 [2024-07-24 13:22:36.706008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:18.146 #31 NEW cov: 11791 ft: 14573 corp: 19/551b lim: 50 exec/s: 31 rss: 70Mb L: 45/49 MS: 1 CopyPart- 00:10:18.146 [2024-07-24 13:22:36.775810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:18.146 [2024-07-24 13:22:36.775854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.146 [2024-07-24 13:22:36.775907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:18.146 [2024-07-24 13:22:36.775935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.146 [2024-07-24 13:22:36.775988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:18.146 [2024-07-24 13:22:36.776015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.146 #32 NEW cov: 11798 ft: 14595 corp: 20/589b lim: 50 exec/s: 32 rss: 70Mb L: 38/49 MS: 1 InsertByte- 00:10:18.146 [2024-07-24 13:22:36.876229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:18.146 [2024-07-24 13:22:36.876271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.146 [2024-07-24 13:22:36.876321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:18.146 [2024-07-24 13:22:36.876348] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.146 [2024-07-24 13:22:36.876396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:18.146 [2024-07-24 13:22:36.876421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.146 [2024-07-24 13:22:36.876465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:18.146 [2024-07-24 13:22:36.876490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:18.146 #33 NEW cov: 11798 ft: 14621 corp: 21/634b lim: 50 exec/s: 16 rss: 70Mb L: 45/49 MS: 1 ChangeBit- 00:10:18.146 #33 DONE cov: 11798 ft: 14621 corp: 21/634b lim: 50 exec/s: 16 rss: 70Mb 00:10:18.146 ###### Recommended dictionary. ###### 00:10:18.146 "\001\000\000\000\000\000\000\017" # Uses: 1 00:10:18.146 ###### End of recommended dictionary. 
###### 00:10:18.146 Done 33 runs in 2 second(s) 00:10:18.405 13:22:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:10:18.405 13:22:37 -- ../common.sh@72 -- # (( i++ )) 00:10:18.405 13:22:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:18.405 13:22:37 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:10:18.405 13:22:37 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:10:18.405 13:22:37 -- nvmf/run.sh@24 -- # local timen=1 00:10:18.405 13:22:37 -- nvmf/run.sh@25 -- # local core=0x1 00:10:18.405 13:22:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:10:18.405 13:22:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:10:18.405 13:22:37 -- nvmf/run.sh@29 -- # printf %02d 22 00:10:18.405 13:22:37 -- nvmf/run.sh@29 -- # port=4422 00:10:18.405 13:22:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:10:18.405 13:22:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:10:18.405 13:22:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:18.405 13:22:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:10:18.405 [2024-07-24 13:22:37.109281] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:10:18.405 [2024-07-24 13:22:37.109356] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3175043 ] 00:10:18.405 EAL: No free 2048 kB hugepages reported on node 1 00:10:18.664 [2024-07-24 13:22:37.363098] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.664 [2024-07-24 13:22:37.389110] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:18.664 [2024-07-24 13:22:37.389294] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.664 [2024-07-24 13:22:37.443798] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:18.664 [2024-07-24 13:22:37.460048] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:10:18.664 INFO: Running with entropic power schedule (0xFF, 100). 00:10:18.664 INFO: Seed: 2241808313 00:10:18.664 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:18.664 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:18.664 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:10:18.664 INFO: A corpus is not provided, starting from an empty corpus 00:10:18.664 #2 INITED exec/s: 0 rss: 61Mb 00:10:18.664 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:18.664 This may also happen if the target rejected all inputs we tried so far 00:10:18.664 [2024-07-24 13:22:37.515672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.664 [2024-07-24 13:22:37.515714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.664 [2024-07-24 13:22:37.515783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.664 [2024-07-24 13:22:37.515806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.182 NEW_FUNC[1/672]: 0x4c5ce0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:10:19.182 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:19.182 #5 NEW cov: 11594 ft: 11571 corp: 2/51b lim: 85 exec/s: 0 rss: 68Mb L: 50/50 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:10:19.182 [2024-07-24 13:22:37.987050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.182 [2024-07-24 13:22:37.987101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.182 [2024-07-24 13:22:37.987167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.182 [2024-07-24 13:22:37.987189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.182 [2024-07-24 13:22:37.987261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 
00:10:19.182 [2024-07-24 13:22:37.987283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.182 #6 NEW cov: 11709 ft: 12389 corp: 3/102b lim: 85 exec/s: 0 rss: 68Mb L: 51/51 MS: 1 CrossOver- 00:10:19.182 [2024-07-24 13:22:38.046887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.182 [2024-07-24 13:22:38.046932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.182 [2024-07-24 13:22:38.047000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.182 [2024-07-24 13:22:38.047022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.441 #9 NEW cov: 11715 ft: 12579 corp: 4/137b lim: 85 exec/s: 0 rss: 69Mb L: 35/51 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:10:19.441 [2024-07-24 13:22:38.097022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.441 [2024-07-24 13:22:38.097060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.097127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.441 [2024-07-24 13:22:38.097152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.441 #10 NEW cov: 11800 ft: 12794 corp: 5/187b lim: 85 exec/s: 0 rss: 69Mb L: 50/51 MS: 1 ChangeBinInt- 00:10:19.441 [2024-07-24 13:22:38.147366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.441 [2024-07-24 
13:22:38.147402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.147449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.441 [2024-07-24 13:22:38.147472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.147539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.441 [2024-07-24 13:22:38.147560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.441 #11 NEW cov: 11800 ft: 12932 corp: 6/253b lim: 85 exec/s: 0 rss: 69Mb L: 66/66 MS: 1 CopyPart- 00:10:19.441 [2024-07-24 13:22:38.207764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.441 [2024-07-24 13:22:38.207801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.207852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.441 [2024-07-24 13:22:38.207873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.207939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.441 [2024-07-24 13:22:38.207961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.208026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:3 nsid:0 00:10:19.441 [2024-07-24 13:22:38.208048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.441 #17 NEW cov: 11800 ft: 13340 corp: 7/323b lim: 85 exec/s: 0 rss: 69Mb L: 70/70 MS: 1 InsertRepeatedBytes- 00:10:19.441 [2024-07-24 13:22:38.257896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.441 [2024-07-24 13:22:38.257933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.257988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.441 [2024-07-24 13:22:38.258009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.258072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.441 [2024-07-24 13:22:38.258094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.441 [2024-07-24 13:22:38.258158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.441 [2024-07-24 13:22:38.258179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.441 #18 NEW cov: 11800 ft: 13457 corp: 8/393b lim: 85 exec/s: 0 rss: 69Mb L: 70/70 MS: 1 ChangeBinInt- 00:10:19.699 [2024-07-24 13:22:38.317675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.699 [2024-07-24 13:22:38.317712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.699 [2024-07-24 13:22:38.317784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.699 [2024-07-24 13:22:38.317808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.699 #19 NEW cov: 11800 ft: 13496 corp: 9/429b lim: 85 exec/s: 0 rss: 69Mb L: 36/70 MS: 1 InsertRepeatedBytes- 00:10:19.699 [2024-07-24 13:22:38.368223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.699 [2024-07-24 13:22:38.368259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.699 [2024-07-24 13:22:38.368309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.699 [2024-07-24 13:22:38.368333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.699 [2024-07-24 13:22:38.368401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.699 [2024-07-24 13:22:38.368424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.699 [2024-07-24 13:22:38.368493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.699 [2024-07-24 13:22:38.368515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.699 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:19.699 #20 NEW cov: 11823 ft: 13526 corp: 10/499b lim: 85 exec/s: 0 rss: 69Mb L: 70/70 MS: 1 
ChangeBit- 00:10:19.699 [2024-07-24 13:22:38.428190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.699 [2024-07-24 13:22:38.428230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.699 [2024-07-24 13:22:38.428279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.699 [2024-07-24 13:22:38.428302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.699 [2024-07-24 13:22:38.428372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.700 [2024-07-24 13:22:38.428395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.700 #21 NEW cov: 11823 ft: 13555 corp: 11/551b lim: 85 exec/s: 0 rss: 69Mb L: 52/70 MS: 1 InsertByte- 00:10:19.700 [2024-07-24 13:22:38.487946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.700 [2024-07-24 13:22:38.487982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.700 #22 NEW cov: 11823 ft: 14409 corp: 12/571b lim: 85 exec/s: 22 rss: 69Mb L: 20/70 MS: 1 CrossOver- 00:10:19.700 [2024-07-24 13:22:38.538674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.700 [2024-07-24 13:22:38.538710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.700 [2024-07-24 13:22:38.538767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 
00:10:19.700 [2024-07-24 13:22:38.538790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.700 [2024-07-24 13:22:38.538853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.700 [2024-07-24 13:22:38.538876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.700 [2024-07-24 13:22:38.538948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.700 [2024-07-24 13:22:38.538971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.958 #23 NEW cov: 11823 ft: 14428 corp: 13/652b lim: 85 exec/s: 23 rss: 69Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:10:19.958 [2024-07-24 13:22:38.588650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.958 [2024-07-24 13:22:38.588687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.588736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.958 [2024-07-24 13:22:38.588759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.588827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.958 [2024-07-24 13:22:38.588849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.958 #24 NEW cov: 11823 ft: 14446 corp: 14/704b lim: 85 exec/s: 24 rss: 69Mb L: 
52/81 MS: 1 ChangeByte- 00:10:19.958 [2024-07-24 13:22:38.649013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.958 [2024-07-24 13:22:38.649049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.649098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.958 [2024-07-24 13:22:38.649120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.649187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.958 [2024-07-24 13:22:38.649216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.649284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.958 [2024-07-24 13:22:38.649307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.958 #25 NEW cov: 11823 ft: 14485 corp: 15/786b lim: 85 exec/s: 25 rss: 69Mb L: 82/82 MS: 1 InsertByte- 00:10:19.958 [2024-07-24 13:22:38.709180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.958 [2024-07-24 13:22:38.709219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.709275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.958 [2024-07-24 13:22:38.709297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.709361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.958 [2024-07-24 13:22:38.709383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.709450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.958 [2024-07-24 13:22:38.709472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.958 #26 NEW cov: 11823 ft: 14516 corp: 16/868b lim: 85 exec/s: 26 rss: 69Mb L: 82/82 MS: 1 ChangeBinInt- 00:10:19.958 [2024-07-24 13:22:38.769352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.958 [2024-07-24 13:22:38.769392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.769437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.958 [2024-07-24 13:22:38.769459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.769524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.958 [2024-07-24 13:22:38.769546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.958 [2024-07-24 13:22:38.769610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.958 [2024-07-24 13:22:38.769631] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.958 #27 NEW cov: 11823 ft: 14536 corp: 17/949b lim: 85 exec/s: 27 rss: 69Mb L: 81/82 MS: 1 CopyPart- 00:10:19.959 [2024-07-24 13:22:38.819492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.959 [2024-07-24 13:22:38.819528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.959 [2024-07-24 13:22:38.819586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.959 [2024-07-24 13:22:38.819607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.959 [2024-07-24 13:22:38.819672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.959 [2024-07-24 13:22:38.819695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.959 [2024-07-24 13:22:38.819761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.959 [2024-07-24 13:22:38.819782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.217 #28 NEW cov: 11823 ft: 14559 corp: 18/1033b lim: 85 exec/s: 28 rss: 70Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:10:20.217 [2024-07-24 13:22:38.879681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.218 [2024-07-24 13:22:38.879717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 
13:22:38.879778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.218 [2024-07-24 13:22:38.879801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.879868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.218 [2024-07-24 13:22:38.879891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.879956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.218 [2024-07-24 13:22:38.879979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.218 #29 NEW cov: 11823 ft: 14571 corp: 19/1113b lim: 85 exec/s: 29 rss: 70Mb L: 80/84 MS: 1 InsertRepeatedBytes- 00:10:20.218 [2024-07-24 13:22:38.929630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.218 [2024-07-24 13:22:38.929665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.929721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.218 [2024-07-24 13:22:38.929744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.929811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.218 [2024-07-24 13:22:38.929833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.218 #30 NEW cov: 11823 ft: 14596 corp: 20/1179b lim: 85 exec/s: 30 rss: 70Mb L: 66/84 MS: 1 CrossOver- 00:10:20.218 [2024-07-24 13:22:38.990022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.218 [2024-07-24 13:22:38.990057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.990106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.218 [2024-07-24 13:22:38.990128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.990193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.218 [2024-07-24 13:22:38.990217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:38.990285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.218 [2024-07-24 13:22:38.990307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.218 #31 NEW cov: 11823 ft: 14645 corp: 21/1260b lim: 85 exec/s: 31 rss: 70Mb L: 81/84 MS: 1 InsertByte- 00:10:20.218 [2024-07-24 13:22:39.050129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.218 [2024-07-24 13:22:39.050165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:39.050217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:1 nsid:0 00:10:20.218 [2024-07-24 13:22:39.050241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:39.050306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.218 [2024-07-24 13:22:39.050328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.218 [2024-07-24 13:22:39.050396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.218 [2024-07-24 13:22:39.050420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.218 #32 NEW cov: 11823 ft: 14666 corp: 22/1340b lim: 85 exec/s: 32 rss: 70Mb L: 80/84 MS: 1 ChangeBit- 00:10:20.477 [2024-07-24 13:22:39.100184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.477 [2024-07-24 13:22:39.100227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.477 [2024-07-24 13:22:39.100275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.477 [2024-07-24 13:22:39.100297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.477 [2024-07-24 13:22:39.100362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.477 [2024-07-24 13:22:39.100385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.478 #33 NEW cov: 11823 ft: 14725 corp: 23/1391b lim: 85 exec/s: 33 rss: 70Mb 
L: 51/84 MS: 1 ChangeBinInt- 00:10:20.478 [2024-07-24 13:22:39.150461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.478 [2024-07-24 13:22:39.150497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.150557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.478 [2024-07-24 13:22:39.150579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.150644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.478 [2024-07-24 13:22:39.150666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.150732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.478 [2024-07-24 13:22:39.150753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.478 #34 NEW cov: 11823 ft: 14726 corp: 24/1461b lim: 85 exec/s: 34 rss: 70Mb L: 70/84 MS: 1 CopyPart- 00:10:20.478 [2024-07-24 13:22:39.200600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.478 [2024-07-24 13:22:39.200636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.200685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.478 [2024-07-24 13:22:39.200707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.200771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.478 [2024-07-24 13:22:39.200793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.200858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.478 [2024-07-24 13:22:39.200881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.478 #35 NEW cov: 11823 ft: 14734 corp: 25/1541b lim: 85 exec/s: 35 rss: 70Mb L: 80/84 MS: 1 ShuffleBytes- 00:10:20.478 [2024-07-24 13:22:39.250785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.478 [2024-07-24 13:22:39.250822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.250882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.478 [2024-07-24 13:22:39.250903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.250969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.478 [2024-07-24 13:22:39.250992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.251061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.478 [2024-07-24 13:22:39.251085] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.478 #36 NEW cov: 11823 ft: 14760 corp: 26/1618b lim: 85 exec/s: 36 rss: 70Mb L: 77/84 MS: 1 InsertRepeatedBytes- 00:10:20.478 [2024-07-24 13:22:39.300593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.478 [2024-07-24 13:22:39.300631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.478 [2024-07-24 13:22:39.300687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.478 [2024-07-24 13:22:39.300709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.478 #37 NEW cov: 11823 ft: 14810 corp: 27/1654b lim: 85 exec/s: 37 rss: 70Mb L: 36/84 MS: 1 ShuffleBytes- 00:10:20.737 [2024-07-24 13:22:39.361128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.737 [2024-07-24 13:22:39.361166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.361220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.737 [2024-07-24 13:22:39.361243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.361307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.737 [2024-07-24 13:22:39.361330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.737 
[2024-07-24 13:22:39.361397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.737 [2024-07-24 13:22:39.361419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.737 #38 NEW cov: 11823 ft: 14855 corp: 28/1736b lim: 85 exec/s: 38 rss: 70Mb L: 82/84 MS: 1 InsertByte- 00:10:20.737 [2024-07-24 13:22:39.411044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.737 [2024-07-24 13:22:39.411081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.411129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.737 [2024-07-24 13:22:39.411151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.411224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.737 [2024-07-24 13:22:39.411248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.737 #39 NEW cov: 11823 ft: 14914 corp: 29/1788b lim: 85 exec/s: 39 rss: 70Mb L: 52/84 MS: 1 ChangeBit- 00:10:20.737 [2024-07-24 13:22:39.471421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:20.737 [2024-07-24 13:22:39.471456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.471517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:20.737 [2024-07-24 
13:22:39.471539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.471605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:20.737 [2024-07-24 13:22:39.471627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.737 [2024-07-24 13:22:39.471695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:20.737 [2024-07-24 13:22:39.471716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:20.737 #40 NEW cov: 11823 ft: 14937 corp: 30/1864b lim: 85 exec/s: 20 rss: 70Mb L: 76/84 MS: 1 InsertRepeatedBytes- 00:10:20.737 #40 DONE cov: 11823 ft: 14937 corp: 30/1864b lim: 85 exec/s: 20 rss: 70Mb 00:10:20.737 Done 40 runs in 2 second(s) 00:10:20.997 13:22:39 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:10:20.997 13:22:39 -- ../common.sh@72 -- # (( i++ )) 00:10:20.997 13:22:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:20.997 13:22:39 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:10:20.997 13:22:39 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:10:20.997 13:22:39 -- nvmf/run.sh@24 -- # local timen=1 00:10:20.997 13:22:39 -- nvmf/run.sh@25 -- # local core=0x1 00:10:20.997 13:22:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:10:20.997 13:22:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:10:20.997 13:22:39 -- nvmf/run.sh@29 -- # printf %02d 23 00:10:20.997 13:22:39 -- nvmf/run.sh@29 -- # port=4423 00:10:20.997 13:22:39 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:10:20.997 13:22:39 -- 
nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:10:20.997 13:22:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:20.997 13:22:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:10:20.997 [2024-07-24 13:22:39.670316] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:20.997 [2024-07-24 13:22:39.670389] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3175407 ] 00:10:20.997 EAL: No free 2048 kB hugepages reported on node 1 00:10:21.256 [2024-07-24 13:22:39.918548] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:21.256 [2024-07-24 13:22:39.944798] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:21.256 [2024-07-24 13:22:39.944967] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:21.256 [2024-07-24 13:22:39.999512] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:21.256 [2024-07-24 13:22:40.015762] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:10:21.256 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:21.256 INFO: Seed: 500842087 00:10:21.256 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:21.256 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:21.256 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:10:21.256 INFO: A corpus is not provided, starting from an empty corpus 00:10:21.256 #2 INITED exec/s: 0 rss: 60Mb 00:10:21.256 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:21.256 This may also happen if the target rejected all inputs we tried so far 00:10:21.256 [2024-07-24 13:22:40.074958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.256 [2024-07-24 13:22:40.075010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.256 [2024-07-24 13:22:40.075075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.256 [2024-07-24 13:22:40.075096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.256 [2024-07-24 13:22:40.075161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.256 [2024-07-24 13:22:40.075181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.824 NEW_FUNC[1/671]: 0x4c8f10 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:10:21.824 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:21.824 #21 NEW cov: 11529 ft: 11517 corp: 2/16b lim: 25 exec/s: 0 
rss: 68Mb L: 15/15 MS: 4 CrossOver-EraseBytes-ChangeBinInt-InsertRepeatedBytes- 00:10:21.824 [2024-07-24 13:22:40.546293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.824 [2024-07-24 13:22:40.546348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.824 [2024-07-24 13:22:40.546418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.824 [2024-07-24 13:22:40.546442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.824 [2024-07-24 13:22:40.546511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.824 [2024-07-24 13:22:40.546532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.824 #22 NEW cov: 11642 ft: 12088 corp: 3/31b lim: 25 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 ChangeByte- 00:10:21.824 [2024-07-24 13:22:40.606395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.824 [2024-07-24 13:22:40.606434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.824 [2024-07-24 13:22:40.606484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.824 [2024-07-24 13:22:40.606506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.824 [2024-07-24 13:22:40.606576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.824 [2024-07-24 13:22:40.606599] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.824 #23 NEW cov: 11648 ft: 12343 corp: 4/46b lim: 25 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 ShuffleBytes- 00:10:21.824 [2024-07-24 13:22:40.656391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.824 [2024-07-24 13:22:40.656428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.824 [2024-07-24 13:22:40.656478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.824 [2024-07-24 13:22:40.656501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.824 #24 NEW cov: 11733 ft: 12796 corp: 5/58b lim: 25 exec/s: 0 rss: 68Mb L: 12/15 MS: 1 EraseBytes- 00:10:22.084 [2024-07-24 13:22:40.706653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.084 [2024-07-24 13:22:40.706691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.706742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.084 [2024-07-24 13:22:40.706765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.706833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.084 [2024-07-24 13:22:40.706855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.084 #25 NEW cov: 11733 ft: 12864 corp: 6/73b lim: 
25 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 ChangeBit- 00:10:22.084 [2024-07-24 13:22:40.766946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.084 [2024-07-24 13:22:40.766983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.767049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.084 [2024-07-24 13:22:40.767072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.767140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.084 [2024-07-24 13:22:40.767162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.767235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:22.084 [2024-07-24 13:22:40.767258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:22.084 #26 NEW cov: 11733 ft: 13355 corp: 7/97b lim: 25 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 CopyPart- 00:10:22.084 [2024-07-24 13:22:40.826981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.084 [2024-07-24 13:22:40.827018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.827070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.084 [2024-07-24 13:22:40.827093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.827162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.084 [2024-07-24 13:22:40.827184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.084 #27 NEW cov: 11733 ft: 13398 corp: 8/112b lim: 25 exec/s: 0 rss: 68Mb L: 15/24 MS: 1 ChangeByte- 00:10:22.084 [2024-07-24 13:22:40.877101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.084 [2024-07-24 13:22:40.877138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.877189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.084 [2024-07-24 13:22:40.877218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.877277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.084 [2024-07-24 13:22:40.877300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.084 #28 NEW cov: 11733 ft: 13428 corp: 9/127b lim: 25 exec/s: 0 rss: 68Mb L: 15/24 MS: 1 ChangeBinInt- 00:10:22.084 [2024-07-24 13:22:40.927233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.084 [2024-07-24 13:22:40.927269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.927321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.084 [2024-07-24 13:22:40.927344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.084 [2024-07-24 13:22:40.927416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.084 [2024-07-24 13:22:40.927437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.344 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:22.344 #29 NEW cov: 11756 ft: 13477 corp: 10/142b lim: 25 exec/s: 0 rss: 68Mb L: 15/24 MS: 1 ChangeBit- 00:10:22.344 [2024-07-24 13:22:40.977351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.344 [2024-07-24 13:22:40.977388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:40.977443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.344 [2024-07-24 13:22:40.977466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:40.977539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.344 [2024-07-24 13:22:40.977559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.344 #30 NEW cov: 11756 ft: 13501 corp: 11/157b lim: 25 exec/s: 0 rss: 68Mb L: 15/24 MS: 1 InsertRepeatedBytes- 00:10:22.344 [2024-07-24 13:22:41.037418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 
nsid:0 00:10:22.344 [2024-07-24 13:22:41.037455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:41.037513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.344 [2024-07-24 13:22:41.037535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.344 #31 NEW cov: 11756 ft: 13554 corp: 12/170b lim: 25 exec/s: 31 rss: 69Mb L: 13/24 MS: 1 EraseBytes- 00:10:22.344 [2024-07-24 13:22:41.097785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.344 [2024-07-24 13:22:41.097822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:41.097878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.344 [2024-07-24 13:22:41.097899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:41.097972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.344 [2024-07-24 13:22:41.097994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.344 #32 NEW cov: 11756 ft: 13573 corp: 13/185b lim: 25 exec/s: 32 rss: 69Mb L: 15/24 MS: 1 ChangeByte- 00:10:22.344 [2024-07-24 13:22:41.137696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.344 [2024-07-24 13:22:41.137732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:41.137793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.344 [2024-07-24 13:22:41.137816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.344 #33 NEW cov: 11756 ft: 13601 corp: 14/198b lim: 25 exec/s: 33 rss: 69Mb L: 13/24 MS: 1 InsertByte- 00:10:22.344 [2024-07-24 13:22:41.187845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.344 [2024-07-24 13:22:41.187882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.344 [2024-07-24 13:22:41.187934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.344 [2024-07-24 13:22:41.187961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.603 #34 NEW cov: 11756 ft: 13620 corp: 15/211b lim: 25 exec/s: 34 rss: 69Mb L: 13/24 MS: 1 ChangeBit- 00:10:22.603 [2024-07-24 13:22:41.248063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.603 [2024-07-24 13:22:41.248100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.248155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.603 [2024-07-24 13:22:41.248178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.603 #35 NEW cov: 11756 ft: 13675 corp: 16/223b lim: 25 exec/s: 35 rss: 69Mb L: 12/24 MS: 1 CopyPart- 00:10:22.603 [2024-07-24 13:22:41.298468] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.603 [2024-07-24 13:22:41.298506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.298561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.603 [2024-07-24 13:22:41.298582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.298652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.603 [2024-07-24 13:22:41.298674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.298741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:22.603 [2024-07-24 13:22:41.298763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:22.603 #36 NEW cov: 11756 ft: 13778 corp: 17/247b lim: 25 exec/s: 36 rss: 69Mb L: 24/24 MS: 1 CrossOver- 00:10:22.603 [2024-07-24 13:22:41.358368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.603 [2024-07-24 13:22:41.358407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.358478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.603 [2024-07-24 13:22:41.358501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.603 #37 NEW 
cov: 11756 ft: 13788 corp: 18/260b lim: 25 exec/s: 37 rss: 69Mb L: 13/24 MS: 1 ShuffleBytes- 00:10:22.603 [2024-07-24 13:22:41.408630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.603 [2024-07-24 13:22:41.408668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.408717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.603 [2024-07-24 13:22:41.408739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.408807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.603 [2024-07-24 13:22:41.408829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.603 #38 NEW cov: 11756 ft: 13797 corp: 19/275b lim: 25 exec/s: 38 rss: 69Mb L: 15/24 MS: 1 ShuffleBytes- 00:10:22.603 [2024-07-24 13:22:41.458601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.603 [2024-07-24 13:22:41.458638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.603 [2024-07-24 13:22:41.458699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.603 [2024-07-24 13:22:41.458723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.862 #39 NEW cov: 11756 ft: 13810 corp: 20/289b lim: 25 exec/s: 39 rss: 69Mb L: 14/24 MS: 1 InsertByte- 00:10:22.862 [2024-07-24 13:22:41.518812] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.862 [2024-07-24 13:22:41.518848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.862 [2024-07-24 13:22:41.518899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.862 [2024-07-24 13:22:41.518922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.862 #40 NEW cov: 11756 ft: 13847 corp: 21/301b lim: 25 exec/s: 40 rss: 69Mb L: 12/24 MS: 1 CopyPart- 00:10:22.862 [2024-07-24 13:22:41.578983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.862 [2024-07-24 13:22:41.579019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.862 [2024-07-24 13:22:41.579071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.862 [2024-07-24 13:22:41.579092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.862 #41 NEW cov: 11756 ft: 13863 corp: 22/313b lim: 25 exec/s: 41 rss: 69Mb L: 12/24 MS: 1 ShuffleBytes- 00:10:22.862 [2024-07-24 13:22:41.629605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.862 [2024-07-24 13:22:41.629642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.862 [2024-07-24 13:22:41.629707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.862 [2024-07-24 13:22:41.629731] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.862 [2024-07-24 13:22:41.629801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.862 [2024-07-24 13:22:41.629824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.862 [2024-07-24 13:22:41.629892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:22.862 [2024-07-24 13:22:41.629913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:22.863 [2024-07-24 13:22:41.629964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:10:22.863 [2024-07-24 13:22:41.629984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:22.863 #42 NEW cov: 11756 ft: 13949 corp: 23/338b lim: 25 exec/s: 42 rss: 69Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:10:22.863 [2024-07-24 13:22:41.679292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.863 [2024-07-24 13:22:41.679330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.863 [2024-07-24 13:22:41.679398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.863 [2024-07-24 13:22:41.679421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.863 #43 NEW cov: 11756 ft: 13964 corp: 24/350b lim: 25 exec/s: 43 rss: 69Mb L: 12/25 MS: 1 CrossOver- 00:10:23.122 [2024-07-24 13:22:41.739641] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.122 [2024-07-24 13:22:41.739677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.739732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.122 [2024-07-24 13:22:41.739753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.739823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:23.122 [2024-07-24 13:22:41.739847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.122 #44 NEW cov: 11756 ft: 14017 corp: 25/365b lim: 25 exec/s: 44 rss: 69Mb L: 15/25 MS: 1 CrossOver- 00:10:23.122 [2024-07-24 13:22:41.779728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.122 [2024-07-24 13:22:41.779764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.779814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.122 [2024-07-24 13:22:41.779836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.779906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:23.122 [2024-07-24 13:22:41.779930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.122 #45 NEW 
cov: 11756 ft: 14040 corp: 26/384b lim: 25 exec/s: 45 rss: 69Mb L: 19/25 MS: 1 InsertRepeatedBytes- 00:10:23.122 [2024-07-24 13:22:41.839962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.122 [2024-07-24 13:22:41.839999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.840047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.122 [2024-07-24 13:22:41.840071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.840145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:23.122 [2024-07-24 13:22:41.840168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.122 #46 NEW cov: 11756 ft: 14058 corp: 27/403b lim: 25 exec/s: 46 rss: 69Mb L: 19/25 MS: 1 ChangeBit- 00:10:23.122 [2024-07-24 13:22:41.899938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.122 [2024-07-24 13:22:41.899978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.900037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.122 [2024-07-24 13:22:41.900060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.122 #47 NEW cov: 11756 ft: 14104 corp: 28/415b lim: 25 exec/s: 47 rss: 70Mb L: 12/25 MS: 1 CrossOver- 00:10:23.122 [2024-07-24 13:22:41.960093] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.122 [2024-07-24 13:22:41.960131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.122 [2024-07-24 13:22:41.960188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.122 [2024-07-24 13:22:41.960219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.381 #48 NEW cov: 11756 ft: 14113 corp: 29/427b lim: 25 exec/s: 48 rss: 70Mb L: 12/25 MS: 1 ShuffleBytes- 00:10:23.381 [2024-07-24 13:22:42.010427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.381 [2024-07-24 13:22:42.010465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.381 [2024-07-24 13:22:42.010520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.381 [2024-07-24 13:22:42.010544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.381 [2024-07-24 13:22:42.010614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:23.381 [2024-07-24 13:22:42.010636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.381 #49 NEW cov: 11756 ft: 14121 corp: 30/442b lim: 25 exec/s: 49 rss: 70Mb L: 15/25 MS: 1 ChangeBit- 00:10:23.381 [2024-07-24 13:22:42.050653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:23.381 [2024-07-24 13:22:42.050691] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.381 [2024-07-24 13:22:42.050753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:23.381 [2024-07-24 13:22:42.050775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.381 [2024-07-24 13:22:42.050847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:23.381 [2024-07-24 13:22:42.050868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.381 [2024-07-24 13:22:42.050937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:23.381 [2024-07-24 13:22:42.050961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.381 #50 NEW cov: 11756 ft: 14126 corp: 31/466b lim: 25 exec/s: 25 rss: 70Mb L: 24/25 MS: 1 CMP- DE: "\000\000\000\000\000\000\003\377"- 00:10:23.381 #50 DONE cov: 11756 ft: 14126 corp: 31/466b lim: 25 exec/s: 25 rss: 70Mb 00:10:23.381 ###### Recommended dictionary. ###### 00:10:23.381 "\000\000\000\000\000\000\003\377" # Uses: 0 00:10:23.381 ###### End of recommended dictionary. 
###### 00:10:23.381 Done 50 runs in 2 second(s) 00:10:23.381 13:22:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:10:23.381 13:22:42 -- ../common.sh@72 -- # (( i++ )) 00:10:23.381 13:22:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:23.381 13:22:42 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:10:23.381 13:22:42 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:10:23.381 13:22:42 -- nvmf/run.sh@24 -- # local timen=1 00:10:23.381 13:22:42 -- nvmf/run.sh@25 -- # local core=0x1 00:10:23.381 13:22:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:10:23.381 13:22:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:10:23.381 13:22:42 -- nvmf/run.sh@29 -- # printf %02d 24 00:10:23.381 13:22:42 -- nvmf/run.sh@29 -- # port=4424 00:10:23.382 13:22:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:10:23.382 13:22:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:10:23.382 13:22:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:23.382 13:22:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:10:23.640 [2024-07-24 13:22:42.272101] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:10:23.640 [2024-07-24 13:22:42.272196] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3175768 ] 00:10:23.640 EAL: No free 2048 kB hugepages reported on node 1 00:10:23.899 [2024-07-24 13:22:42.508882] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:23.899 [2024-07-24 13:22:42.535099] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:23.899 [2024-07-24 13:22:42.535280] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:23.899 [2024-07-24 13:22:42.589799] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:23.899 [2024-07-24 13:22:42.606044] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:10:23.899 INFO: Running with entropic power schedule (0xFF, 100). 00:10:23.899 INFO: Seed: 3092842069 00:10:23.899 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:10:23.899 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:10:23.899 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:10:23.899 INFO: A corpus is not provided, starting from an empty corpus 00:10:23.899 #2 INITED exec/s: 0 rss: 61Mb 00:10:23.899 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:23.899 This may also happen if the target rejected all inputs we tried so far 00:10:23.899 [2024-07-24 13:22:42.661370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.900 [2024-07-24 13:22:42.661416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.900 [2024-07-24 13:22:42.661469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.900 [2024-07-24 13:22:42.661497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.900 [2024-07-24 13:22:42.661544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.900 [2024-07-24 13:22:42.661570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.900 [2024-07-24 13:22:42.661615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.900 [2024-07-24 13:22:42.661640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.159 NEW_FUNC[1/672]: 0x4c9ff0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:10:24.159 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:24.159 #3 NEW cov: 11601 ft: 11602 corp: 2/97b lim: 100 exec/s: 0 rss: 68Mb L: 
96/96 MS: 1 InsertRepeatedBytes- 00:10:24.159 [2024-07-24 13:22:43.022227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21563 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.159 [2024-07-24 13:22:43.022286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.159 [2024-07-24 13:22:43.022340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.159 [2024-07-24 13:22:43.022373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.159 [2024-07-24 13:22:43.022420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.159 [2024-07-24 13:22:43.022444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.159 [2024-07-24 13:22:43.022489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.159 [2024-07-24 13:22:43.022515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.418 #4 NEW cov: 11714 ft: 12143 corp: 3/194b lim: 100 exec/s: 0 rss: 68Mb L: 97/97 MS: 1 InsertByte- 00:10:24.418 [2024-07-24 13:22:43.122347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.418 [2024-07-24 13:22:43.122394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:10:24.418 [2024-07-24 13:22:43.122445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.418 [2024-07-24 13:22:43.122473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.418 [2024-07-24 13:22:43.122520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.418 [2024-07-24 13:22:43.122546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.418 [2024-07-24 13:22:43.122590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.418 [2024-07-24 13:22:43.122615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.418 #10 NEW cov: 11720 ft: 12368 corp: 4/290b lim: 100 exec/s: 0 rss: 68Mb L: 96/97 MS: 1 CrossOver- 00:10:24.419 [2024-07-24 13:22:43.192583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.419 [2024-07-24 13:22:43.192627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.419 [2024-07-24 13:22:43.192678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.419 [2024-07-24 13:22:43.192705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:10:24.419 [2024-07-24 13:22:43.192752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.419 [2024-07-24 13:22:43.192778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.419 [2024-07-24 13:22:43.192821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.419 [2024-07-24 13:22:43.192846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.419 [2024-07-24 13:22:43.192891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6076574157621187668 len:85 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.419 [2024-07-24 13:22:43.192921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:24.419 #11 NEW cov: 11805 ft: 12653 corp: 5/390b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:10:24.679 [2024-07-24 13:22:43.292877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.292923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.679 [2024-07-24 13:22:43.292973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.293001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:10:24.679 [2024-07-24 13:22:43.293048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.293074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.679 [2024-07-24 13:22:43.293119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.293145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.679 [2024-07-24 13:22:43.293191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6076574157621187668 len:85 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.293226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:24.679 #17 NEW cov: 11805 ft: 12676 corp: 6/490b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ShuffleBytes- 00:10:24.679 [2024-07-24 13:22:43.383013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.383057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.679 [2024-07-24 13:22:43.383109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.383136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:10:24.679 [2024-07-24 13:22:43.383182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.383208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.679 [2024-07-24 13:22:43.383262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.383288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.679 #18 NEW cov: 11805 ft: 12726 corp: 7/585b lim: 100 exec/s: 0 rss: 69Mb L: 95/100 MS: 1 EraseBytes- 00:10:24.679 [2024-07-24 13:22:43.453070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.453115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.679 [2024-07-24 13:22:43.453168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.679 [2024-07-24 13:22:43.453201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.679 #19 NEW cov: 11805 ft: 13176 corp: 8/642b lim: 100 exec/s: 0 rss: 69Mb L: 57/100 MS: 1 EraseBytes- 00:10:24.938 [2024-07-24 13:22:43.553486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.938 [2024-07-24 13:22:43.553530] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.938 [2024-07-24 13:22:43.553582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.938 [2024-07-24 13:22:43.553609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.938 [2024-07-24 13:22:43.553656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.553682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.553727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.553752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.939 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:24.939 #20 NEW cov: 11822 ft: 13222 corp: 9/737b lim: 100 exec/s: 0 rss: 69Mb L: 95/100 MS: 1 ShuffleBytes- 00:10:24.939 [2024-07-24 13:22:43.633759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.633803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.633852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.633881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.633929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.633955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.633999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.634023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.634068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6076574157621187668 len:85 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.634093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:24.939 #21 NEW cov: 11822 ft: 13238 corp: 10/837b lim: 100 exec/s: 21 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:10:24.939 [2024-07-24 13:22:43.723965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.724008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.724064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:6076574518398440532 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.724091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.724137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.724162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.724206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.724239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.939 #22 NEW cov: 11822 ft: 13306 corp: 11/932b lim: 100 exec/s: 22 rss: 69Mb L: 95/100 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:10:24.939 [2024-07-24 13:22:43.794015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.794057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.794108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.794135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.939 [2024-07-24 13:22:43.794184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.939 [2024-07-24 13:22:43.794209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.198 #23 NEW cov: 11822 ft: 13598 corp: 12/997b lim: 100 exec/s: 23 rss: 69Mb L: 65/100 MS: 1 EraseBytes- 00:10:25.198 [2024-07-24 13:22:43.864302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926555 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.864344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:43.864393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.864420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:43.864467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.864492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:43.864537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.864561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.198 #24 NEW cov: 11822 ft: 13634 corp: 13/1093b lim: 100 exec/s: 24 rss: 69Mb L: 96/100 MS: 1 ChangeBinInt- 
00:10:25.198 [2024-07-24 13:22:43.934506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926555 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.934553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:43.934603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.934631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:43.934678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.934703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:43.934747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:43.934772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.198 #25 NEW cov: 11822 ft: 13662 corp: 14/1192b lim: 100 exec/s: 25 rss: 69Mb L: 99/100 MS: 1 CrossOver- 00:10:25.198 [2024-07-24 13:22:44.034838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21563 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:44.034881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:10:25.198 [2024-07-24 13:22:44.034933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:44.034960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:44.035008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:44.035033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.198 [2024-07-24 13:22:44.035078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.198 [2024-07-24 13:22:44.035102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.495 #26 NEW cov: 11822 ft: 13714 corp: 15/1277b lim: 100 exec/s: 26 rss: 69Mb L: 85/100 MS: 1 CrossOver- 00:10:25.495 [2024-07-24 13:22:44.125145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.125189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.125246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.125275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:10:25.495 [2024-07-24 13:22:44.125323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.125350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.125394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.125419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.125470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6076574157621187668 len:85 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.125496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:25.495 #27 NEW cov: 11822 ft: 13761 corp: 16/1377b lim: 100 exec/s: 27 rss: 70Mb L: 100/100 MS: 1 ChangeByte- 00:10:25.495 [2024-07-24 13:22:44.215310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.215353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.215404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.215432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:10:25.495 [2024-07-24 13:22:44.215479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.215504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.215550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.215576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.495 #28 NEW cov: 11822 ft: 13786 corp: 17/1472b lim: 100 exec/s: 28 rss: 70Mb L: 95/100 MS: 1 CopyPart- 00:10:25.495 [2024-07-24 13:22:44.285551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.285595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.285644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.285673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.285722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.285747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:10:25.495 [2024-07-24 13:22:44.285790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21691 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.285815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.495 [2024-07-24 13:22:44.285860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6076574157621187668 len:85 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.495 [2024-07-24 13:22:44.285886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:25.754 #29 NEW cov: 11822 ft: 13834 corp: 18/1572b lim: 100 exec/s: 29 rss: 70Mb L: 100/100 MS: 1 ChangeByte- 00:10:25.754 [2024-07-24 13:22:44.375761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.754 [2024-07-24 13:22:44.375810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.754 [2024-07-24 13:22:44.375862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.754 [2024-07-24 13:22:44.375889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.754 [2024-07-24 13:22:44.375936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.754 [2024-07-24 13:22:44.375962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.754 
[2024-07-24 13:22:44.376007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.754 [2024-07-24 13:22:44.376035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:25.754 #30 NEW cov: 11822 ft: 13914 corp: 19/1667b lim: 100 exec/s: 30 rss: 70Mb L: 95/100 MS: 1 ShuffleBytes- 00:10:25.755 [2024-07-24 13:22:44.465875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.465919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.755 [2024-07-24 13:22:44.465970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.465998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.755 [2024-07-24 13:22:44.466045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.466070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.755 #31 NEW cov: 11829 ft: 13922 corp: 20/1736b lim: 100 exec/s: 31 rss: 70Mb L: 69/100 MS: 1 CMP- DE: "\366\377\377\377"- 00:10:25.755 [2024-07-24 13:22:44.566244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.566289] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:25.755 [2024-07-24 13:22:44.566340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.566368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:25.755 [2024-07-24 13:22:44.566416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.566442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:25.755 [2024-07-24 13:22:44.566486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:25.755 [2024-07-24 13:22:44.566512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:26.014 #32 NEW cov: 11829 ft: 14018 corp: 21/1832b lim: 100 exec/s: 32 rss: 70Mb L: 96/100 MS: 1 InsertByte- 00:10:26.014 [2024-07-24 13:22:44.656532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6076574517156926548 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:26.014 [2024-07-24 13:22:44.656584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:26.014 [2024-07-24 13:22:44.656636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6076574518398440532 len:257 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:26.014 [2024-07-24 13:22:44.656664] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:26.014 [2024-07-24 13:22:44.656711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:26.014 [2024-07-24 13:22:44.656737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:26.014 [2024-07-24 13:22:44.656782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6076574518398440532 len:21589 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:26.014 [2024-07-24 13:22:44.656808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:26.014 #33 NEW cov: 11829 ft: 14032 corp: 22/1927b lim: 100 exec/s: 16 rss: 70Mb L: 95/100 MS: 1 ChangeByte- 00:10:26.014 #33 DONE cov: 11829 ft: 14032 corp: 22/1927b lim: 100 exec/s: 16 rss: 70Mb 00:10:26.014 ###### Recommended dictionary. ###### 00:10:26.014 "\001\000\000\000\000\000\000\000" # Uses: 0 00:10:26.014 "\366\377\377\377" # Uses: 0 00:10:26.014 ###### End of recommended dictionary. 
###### 00:10:26.014 Done 33 runs in 2 second(s) 00:10:26.014 13:22:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:10:26.014 13:22:44 -- ../common.sh@72 -- # (( i++ )) 00:10:26.014 13:22:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:26.014 13:22:44 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:10:26.014 00:10:26.014 real 1m6.096s 00:10:26.014 user 1m37.774s 00:10:26.014 sys 0m9.655s 00:10:26.014 13:22:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:26.014 13:22:44 -- common/autotest_common.sh@10 -- # set +x 00:10:26.014 ************************************ 00:10:26.014 END TEST nvmf_fuzz 00:10:26.014 ************************************ 00:10:26.275 13:22:44 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:10:26.275 13:22:44 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:10:26.275 13:22:44 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:10:26.275 13:22:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:10:26.275 13:22:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:10:26.275 13:22:44 -- common/autotest_common.sh@10 -- # set +x 00:10:26.275 ************************************ 00:10:26.275 START TEST vfio_fuzz 00:10:26.275 ************************************ 00:10:26.275 13:22:44 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:10:26.275 * Looking for test storage... 
00:10:26.275 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:26.275 13:22:45 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:10:26.275 13:22:45 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:10:26.275 13:22:45 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:10:26.275 13:22:45 -- common/autotest_common.sh@34 -- # set -e 00:10:26.275 13:22:45 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:10:26.275 13:22:45 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:10:26.275 13:22:45 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:10:26.275 13:22:45 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:10:26.275 13:22:45 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:10:26.275 13:22:45 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:10:26.275 13:22:45 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:10:26.275 13:22:45 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:10:26.275 13:22:45 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:10:26.275 13:22:45 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:10:26.275 13:22:45 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:10:26.275 13:22:45 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:10:26.275 13:22:45 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:10:26.275 13:22:45 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:10:26.275 13:22:45 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:10:26.275 13:22:45 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:10:26.275 13:22:45 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:10:26.275 13:22:45 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:10:26.275 13:22:45 -- 
common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:10:26.275 13:22:45 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:10:26.275 13:22:45 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:10:26.275 13:22:45 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:10:26.275 13:22:45 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:10:26.275 13:22:45 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:10:26.275 13:22:45 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:10:26.275 13:22:45 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:10:26.275 13:22:45 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:10:26.275 13:22:45 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:10:26.275 13:22:45 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:10:26.275 13:22:45 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:10:26.275 13:22:45 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:10:26.275 13:22:45 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:10:26.275 13:22:45 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:10:26.275 13:22:45 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:10:26.275 13:22:45 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:10:26.275 13:22:45 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:10:26.275 13:22:45 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:10:26.275 13:22:45 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:10:26.275 13:22:45 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:10:26.275 13:22:45 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:26.275 13:22:45 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:10:26.275 13:22:45 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:10:26.275 13:22:45 -- 
common/build_config.sh@39 -- # CONFIG_VHOST=y 00:10:26.275 13:22:45 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:10:26.275 13:22:45 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:10:26.275 13:22:45 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:10:26.275 13:22:45 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:10:26.275 13:22:45 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:10:26.275 13:22:45 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:10:26.275 13:22:45 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:10:26.275 13:22:45 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:10:26.275 13:22:45 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:10:26.275 13:22:45 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:10:26.275 13:22:45 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:10:26.275 13:22:45 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:10:26.275 13:22:45 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:10:26.275 13:22:45 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:10:26.275 13:22:45 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:10:26.275 13:22:45 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:10:26.275 13:22:45 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:10:26.275 13:22:45 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:10:26.275 13:22:45 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:10:26.275 13:22:45 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:10:26.275 13:22:45 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:10:26.275 13:22:45 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:26.275 13:22:45 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:10:26.275 13:22:45 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:10:26.275 13:22:45 -- 
common/build_config.sh@64 -- # CONFIG_SHARED=n 00:10:26.275 13:22:45 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:10:26.275 13:22:45 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:10:26.275 13:22:45 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:10:26.275 13:22:45 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:10:26.275 13:22:45 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:10:26.275 13:22:45 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:10:26.275 13:22:45 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:10:26.275 13:22:45 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:10:26.275 13:22:45 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:10:26.275 13:22:45 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:10:26.275 13:22:45 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:10:26.275 13:22:45 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:10:26.275 13:22:45 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:10:26.275 13:22:45 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:10:26.275 13:22:45 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:10:26.275 13:22:45 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:10:26.275 13:22:45 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:10:26.275 13:22:45 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:10:26.275 13:22:45 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:10:26.275 13:22:45 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:26.275 13:22:45 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:10:26.275 13:22:45 -- common/applications.sh@11 -- # 
_test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:10:26.275 13:22:45 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:10:26.275 13:22:45 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:10:26.275 13:22:45 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:10:26.275 13:22:45 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:10:26.275 13:22:45 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:10:26.275 13:22:45 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:10:26.275 13:22:45 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:10:26.275 13:22:45 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:10:26.275 13:22:45 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:10:26.275 #define SPDK_CONFIG_H 00:10:26.275 #define SPDK_CONFIG_APPS 1 00:10:26.275 #define SPDK_CONFIG_ARCH native 00:10:26.275 #undef SPDK_CONFIG_ASAN 00:10:26.275 #undef SPDK_CONFIG_AVAHI 00:10:26.275 #undef SPDK_CONFIG_CET 00:10:26.275 #define SPDK_CONFIG_COVERAGE 1 00:10:26.275 #define SPDK_CONFIG_CROSS_PREFIX 00:10:26.275 #undef SPDK_CONFIG_CRYPTO 00:10:26.275 #undef SPDK_CONFIG_CRYPTO_MLX5 00:10:26.275 #undef SPDK_CONFIG_CUSTOMOCF 00:10:26.275 #undef SPDK_CONFIG_DAOS 00:10:26.275 #define SPDK_CONFIG_DAOS_DIR 00:10:26.275 #define SPDK_CONFIG_DEBUG 1 00:10:26.275 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:10:26.275 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:26.275 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:10:26.275 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:26.275 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:10:26.275 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:10:26.275 #define SPDK_CONFIG_EXAMPLES 1 00:10:26.276 #undef SPDK_CONFIG_FC 00:10:26.276 #define SPDK_CONFIG_FC_PATH 00:10:26.276 #define SPDK_CONFIG_FIO_PLUGIN 1 00:10:26.276 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:10:26.276 #undef SPDK_CONFIG_FUSE 00:10:26.276 #define SPDK_CONFIG_FUZZER 1 00:10:26.276 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:10:26.276 #undef SPDK_CONFIG_GOLANG 00:10:26.276 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:10:26.276 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:10:26.276 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:10:26.276 #undef SPDK_CONFIG_HAVE_LIBBSD 00:10:26.276 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:10:26.276 #define SPDK_CONFIG_IDXD 1 00:10:26.276 #define SPDK_CONFIG_IDXD_KERNEL 1 00:10:26.276 #undef SPDK_CONFIG_IPSEC_MB 00:10:26.276 #define SPDK_CONFIG_IPSEC_MB_DIR 00:10:26.276 #define SPDK_CONFIG_ISAL 1 00:10:26.276 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:10:26.276 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:10:26.276 #define SPDK_CONFIG_LIBDIR 00:10:26.276 #undef SPDK_CONFIG_LTO 00:10:26.276 #define SPDK_CONFIG_MAX_LCORES 00:10:26.276 #define SPDK_CONFIG_NVME_CUSE 1 00:10:26.276 #undef SPDK_CONFIG_OCF 00:10:26.276 #define SPDK_CONFIG_OCF_PATH 00:10:26.276 #define SPDK_CONFIG_OPENSSL_PATH 00:10:26.276 #undef SPDK_CONFIG_PGO_CAPTURE 00:10:26.276 #undef SPDK_CONFIG_PGO_USE 00:10:26.276 #define SPDK_CONFIG_PREFIX /usr/local 00:10:26.276 #undef SPDK_CONFIG_RAID5F 00:10:26.276 #undef SPDK_CONFIG_RBD 00:10:26.276 #define SPDK_CONFIG_RDMA 1 00:10:26.276 #define SPDK_CONFIG_RDMA_PROV verbs 00:10:26.276 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:10:26.276 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:10:26.276 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:10:26.276 #undef SPDK_CONFIG_SHARED 00:10:26.276 #undef SPDK_CONFIG_SMA 00:10:26.276 #define SPDK_CONFIG_TESTS 1 00:10:26.276 #undef SPDK_CONFIG_TSAN 
00:10:26.276 #define SPDK_CONFIG_UBLK 1 00:10:26.276 #define SPDK_CONFIG_UBSAN 1 00:10:26.276 #undef SPDK_CONFIG_UNIT_TESTS 00:10:26.276 #undef SPDK_CONFIG_URING 00:10:26.276 #define SPDK_CONFIG_URING_PATH 00:10:26.276 #undef SPDK_CONFIG_URING_ZNS 00:10:26.276 #undef SPDK_CONFIG_USDT 00:10:26.276 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:10:26.276 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:10:26.276 #define SPDK_CONFIG_VFIO_USER 1 00:10:26.276 #define SPDK_CONFIG_VFIO_USER_DIR 00:10:26.276 #define SPDK_CONFIG_VHOST 1 00:10:26.276 #define SPDK_CONFIG_VIRTIO 1 00:10:26.276 #undef SPDK_CONFIG_VTUNE 00:10:26.276 #define SPDK_CONFIG_VTUNE_DIR 00:10:26.276 #define SPDK_CONFIG_WERROR 1 00:10:26.276 #define SPDK_CONFIG_WPDK_DIR 00:10:26.276 #undef SPDK_CONFIG_XNVME 00:10:26.276 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:10:26.276 13:22:45 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:10:26.276 13:22:45 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:10:26.276 13:22:45 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:10:26.276 13:22:45 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:26.276 13:22:45 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:26.276 13:22:45 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:26.276 13:22:45 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:26.276 13:22:45 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:26.276 13:22:45 -- paths/export.sh@5 -- # export PATH 00:10:26.276 13:22:45 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:26.276 13:22:45 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:10:26.276 13:22:45 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:10:26.276 13:22:45 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:10:26.276 13:22:45 -- pm/common@6 -- # 
_pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:10:26.276 13:22:45 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:10:26.276 13:22:45 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:26.276 13:22:45 -- pm/common@16 -- # TEST_TAG=N/A 00:10:26.276 13:22:45 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:10:26.276 13:22:45 -- common/autotest_common.sh@52 -- # : 1 00:10:26.276 13:22:45 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:10:26.276 13:22:45 -- common/autotest_common.sh@56 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:10:26.276 13:22:45 -- common/autotest_common.sh@58 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:10:26.276 13:22:45 -- common/autotest_common.sh@60 -- # : 1 00:10:26.276 13:22:45 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:10:26.276 13:22:45 -- common/autotest_common.sh@62 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:10:26.276 13:22:45 -- common/autotest_common.sh@64 -- # : 00:10:26.276 13:22:45 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:10:26.276 13:22:45 -- common/autotest_common.sh@66 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:10:26.276 13:22:45 -- common/autotest_common.sh@68 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:10:26.276 13:22:45 -- common/autotest_common.sh@70 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:10:26.276 13:22:45 -- common/autotest_common.sh@72 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:10:26.276 
13:22:45 -- common/autotest_common.sh@74 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:10:26.276 13:22:45 -- common/autotest_common.sh@76 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:10:26.276 13:22:45 -- common/autotest_common.sh@78 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:10:26.276 13:22:45 -- common/autotest_common.sh@80 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:10:26.276 13:22:45 -- common/autotest_common.sh@82 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:10:26.276 13:22:45 -- common/autotest_common.sh@84 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:10:26.276 13:22:45 -- common/autotest_common.sh@86 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:10:26.276 13:22:45 -- common/autotest_common.sh@88 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:10:26.276 13:22:45 -- common/autotest_common.sh@90 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:10:26.276 13:22:45 -- common/autotest_common.sh@92 -- # : 1 00:10:26.276 13:22:45 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:10:26.276 13:22:45 -- common/autotest_common.sh@94 -- # : 1 00:10:26.276 13:22:45 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:10:26.276 13:22:45 -- common/autotest_common.sh@96 -- # : rdma 00:10:26.276 13:22:45 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:10:26.276 13:22:45 -- common/autotest_common.sh@98 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:10:26.276 13:22:45 -- common/autotest_common.sh@100 -- # : 0 00:10:26.276 13:22:45 
-- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:10:26.276 13:22:45 -- common/autotest_common.sh@102 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:10:26.276 13:22:45 -- common/autotest_common.sh@104 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:10:26.276 13:22:45 -- common/autotest_common.sh@106 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:10:26.276 13:22:45 -- common/autotest_common.sh@108 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:10:26.276 13:22:45 -- common/autotest_common.sh@110 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:10:26.276 13:22:45 -- common/autotest_common.sh@112 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:10:26.276 13:22:45 -- common/autotest_common.sh@114 -- # : 0 00:10:26.276 13:22:45 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:10:26.276 13:22:45 -- common/autotest_common.sh@116 -- # : 1 00:10:26.276 13:22:45 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:10:26.276 13:22:45 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:26.276 13:22:45 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:10:26.277 13:22:45 -- common/autotest_common.sh@120 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:10:26.277 13:22:45 -- common/autotest_common.sh@122 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:10:26.277 13:22:45 -- common/autotest_common.sh@124 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:10:26.277 13:22:45 -- common/autotest_common.sh@126 -- # : 0 00:10:26.277 13:22:45 -- 
common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:10:26.277 13:22:45 -- common/autotest_common.sh@128 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:10:26.277 13:22:45 -- common/autotest_common.sh@130 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:10:26.277 13:22:45 -- common/autotest_common.sh@132 -- # : v23.11 00:10:26.277 13:22:45 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:10:26.277 13:22:45 -- common/autotest_common.sh@134 -- # : true 00:10:26.277 13:22:45 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:10:26.277 13:22:45 -- common/autotest_common.sh@136 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:10:26.277 13:22:45 -- common/autotest_common.sh@138 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:10:26.277 13:22:45 -- common/autotest_common.sh@140 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:10:26.277 13:22:45 -- common/autotest_common.sh@142 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:10:26.277 13:22:45 -- common/autotest_common.sh@144 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:10:26.277 13:22:45 -- common/autotest_common.sh@146 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:10:26.277 13:22:45 -- common/autotest_common.sh@148 -- # : 00:10:26.277 13:22:45 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:10:26.277 13:22:45 -- common/autotest_common.sh@150 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:10:26.277 13:22:45 -- common/autotest_common.sh@152 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 
00:10:26.277 13:22:45 -- common/autotest_common.sh@154 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:10:26.277 13:22:45 -- common/autotest_common.sh@156 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:10:26.277 13:22:45 -- common/autotest_common.sh@158 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:10:26.277 13:22:45 -- common/autotest_common.sh@160 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:10:26.277 13:22:45 -- common/autotest_common.sh@163 -- # : 00:10:26.277 13:22:45 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:10:26.277 13:22:45 -- common/autotest_common.sh@165 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:10:26.277 13:22:45 -- common/autotest_common.sh@167 -- # : 0 00:10:26.277 13:22:45 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:10:26.277 13:22:45 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@174 -- # 
export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:26.277 13:22:45 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:10:26.277 13:22:45 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:10:26.277 13:22:45 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:10:26.277 13:22:45 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:10:26.277 13:22:45 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:10:26.277 13:22:45 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:10:26.277 13:22:45 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:10:26.277 13:22:45 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:10:26.277 13:22:45 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:10:26.277 13:22:45 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:10:26.277 13:22:45 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:10:26.277 13:22:45 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:10:26.277 13:22:45 -- common/autotest_common.sh@196 -- # cat 00:10:26.277 13:22:45 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:10:26.277 13:22:45 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:10:26.277 13:22:45 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:10:26.277 13:22:45 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:10:26.277 13:22:45 -- 
common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:10:26.277 13:22:45 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:10:26.277 13:22:45 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:10:26.277 13:22:45 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:10:26.277 13:22:45 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:10:26.277 13:22:45 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:10:26.277 13:22:45 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:10:26.277 13:22:45 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:10:26.277 13:22:45 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:10:26.277 13:22:45 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:10:26.277 13:22:45 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:10:26.277 13:22:45 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:10:26.277 13:22:45 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:10:26.277 13:22:45 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:10:26.277 13:22:45 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:10:26.277 13:22:45 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:10:26.277 13:22:45 -- common/autotest_common.sh@249 -- # export valgrind= 00:10:26.277 13:22:45 -- 
common/autotest_common.sh@249 -- # valgrind= 00:10:26.277 13:22:45 -- common/autotest_common.sh@255 -- # uname -s 00:10:26.277 13:22:45 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:10:26.277 13:22:45 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:10:26.277 13:22:45 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:10:26.277 13:22:45 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:10:26.277 13:22:45 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:10:26.277 13:22:45 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:10:26.277 13:22:45 -- common/autotest_common.sh@265 -- # MAKE=make 00:10:26.277 13:22:45 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j72 00:10:26.277 13:22:45 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:10:26.277 13:22:45 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:10:26.277 13:22:45 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:10:26.278 13:22:45 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:10:26.278 13:22:45 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:10:26.278 13:22:45 -- common/autotest_common.sh@309 -- # [[ -z 3176165 ]] 00:10:26.278 13:22:45 -- common/autotest_common.sh@309 -- # kill -0 3176165 00:10:26.278 13:22:45 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:10:26.278 13:22:45 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:10:26.278 13:22:45 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:10:26.278 13:22:45 -- common/autotest_common.sh@322 -- # local mount target_dir 00:10:26.278 13:22:45 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:10:26.278 13:22:45 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:10:26.278 13:22:45 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:10:26.278 13:22:45 -- common/autotest_common.sh@329 -- # 
mktemp -udt spdk.XXXXXX 00:10:26.278 13:22:45 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.n67jqx 00:10:26.278 13:22:45 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:10:26.278 13:22:45 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:10:26.278 13:22:45 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:10:26.278 13:22:45 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.n67jqx/tests/vfio /tmp/spdk.n67jqx 00:10:26.278 13:22:45 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@318 -- # df -T 00:10:26.278 13:22:45 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=893108224 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=4391321600 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 
13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=80270766080 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=94508572672 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=14237806592 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=47200768000 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254286336 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=53518336 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=18895626240 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=18901716992 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=6090752 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=47252979712 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254286336 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=1306624 
00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # avails["$mount"]=9450852352 00:10:26.278 13:22:45 -- common/autotest_common.sh@353 -- # sizes["$mount"]=9450856448 00:10:26.278 13:22:45 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:10:26.278 13:22:45 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:26.278 13:22:45 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:10:26.278 * Looking for test storage... 00:10:26.278 13:22:45 -- common/autotest_common.sh@359 -- # local target_space new_size 00:10:26.278 13:22:45 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:10:26.278 13:22:45 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:26.278 13:22:45 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:10:26.537 13:22:45 -- common/autotest_common.sh@363 -- # mount=/ 00:10:26.537 13:22:45 -- common/autotest_common.sh@365 -- # target_space=80270766080 00:10:26.537 13:22:45 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:10:26.537 13:22:45 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:10:26.537 13:22:45 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:10:26.537 13:22:45 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:10:26.537 13:22:45 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:10:26.537 13:22:45 -- common/autotest_common.sh@372 -- # new_size=16452399104 00:10:26.537 13:22:45 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:10:26.537 13:22:45 -- 
common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:26.537 13:22:45 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:26.537 13:22:45 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:26.537 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:26.537 13:22:45 -- common/autotest_common.sh@380 -- # return 0 00:10:26.537 13:22:45 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:10:26.537 13:22:45 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:10:26.537 13:22:45 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:10:26.537 13:22:45 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:10:26.537 13:22:45 -- common/autotest_common.sh@1672 -- # true 00:10:26.537 13:22:45 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:10:26.537 13:22:45 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:10:26.537 13:22:45 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:10:26.537 13:22:45 -- common/autotest_common.sh@27 -- # exec 00:10:26.537 13:22:45 -- common/autotest_common.sh@29 -- # exec 00:10:26.537 13:22:45 -- common/autotest_common.sh@31 -- # xtrace_restore 00:10:26.537 13:22:45 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:10:26.537 13:22:45 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:10:26.537 13:22:45 -- common/autotest_common.sh@18 -- # set -x 00:10:26.537 13:22:45 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:10:26.537 13:22:45 -- ../common.sh@8 -- # pids=() 00:10:26.537 13:22:45 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:10:26.537 13:22:45 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:10:26.537 13:22:45 -- vfio/run.sh@59 -- # fuzz_num=7 00:10:26.537 13:22:45 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:10:26.537 13:22:45 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:10:26.537 13:22:45 -- vfio/run.sh@65 -- # mem_size=0 00:10:26.537 13:22:45 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:10:26.537 13:22:45 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:10:26.537 13:22:45 -- ../common.sh@69 -- # local fuzz_num=7 00:10:26.537 13:22:45 -- ../common.sh@70 -- # local time=1 00:10:26.537 13:22:45 -- ../common.sh@72 -- # (( i = 0 )) 00:10:26.537 13:22:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:26.538 13:22:45 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:10:26.538 13:22:45 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:10:26.538 13:22:45 -- vfio/run.sh@23 -- # local timen=1 00:10:26.538 13:22:45 -- vfio/run.sh@24 -- # local core=0x1 00:10:26.538 13:22:45 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:10:26.538 13:22:45 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:10:26.538 13:22:45 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:10:26.538 13:22:45 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:10:26.538 13:22:45 -- 
vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:10:26.538 13:22:45 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:10:26.538 13:22:45 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:10:26.538 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:26.538 13:22:45 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:10:26.538 [2024-07-24 13:22:45.199388] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:26.538 [2024-07-24 13:22:45.199477] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176202 ] 00:10:26.538 EAL: No free 2048 kB hugepages reported on node 1 00:10:26.538 [2024-07-24 13:22:45.312583] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:26.538 [2024-07-24 13:22:45.357907] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:26.538 [2024-07-24 13:22:45.358104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:26.797 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:26.797 INFO: Seed: 1731884316 00:10:26.797 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:26.797 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:26.797 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:10:26.797 INFO: A corpus is not provided, starting from an empty corpus 00:10:26.797 #2 INITED exec/s: 0 rss: 62Mb 00:10:26.797 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:26.797 This may also happen if the target rejected all inputs we tried so far 00:10:27.624 NEW_FUNC[1/625]: 0x49e0e0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:10:27.624 NEW_FUNC[2/625]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:27.624 #5 NEW cov: 10654 ft: 10557 corp: 2/23b lim: 60 exec/s: 0 rss: 68Mb L: 22/22 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:10:27.887 NEW_FUNC[1/7]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:27.887 NEW_FUNC[2/7]: 0x1c5bd90 in timed_pollers_tree_RB_NEXT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:322 00:10:27.887 #8 NEW cov: 10747 ft: 14234 corp: 3/65b lim: 60 exec/s: 8 rss: 69Mb L: 42/42 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:10:28.215 #9 NEW cov: 10747 ft: 15175 corp: 4/87b lim: 60 exec/s: 9 rss: 70Mb L: 22/42 MS: 1 ChangeBinInt- 00:10:28.474 #10 NEW cov: 10747 ft: 15712 corp: 5/105b lim: 60 exec/s: 10 rss: 70Mb L: 18/42 MS: 1 EraseBytes- 00:10:28.733 #11 NEW cov: 10754 ft: 15775 corp: 6/147b lim: 60 exec/s: 11 rss: 70Mb L: 42/42 MS: 1 ChangeByte- 00:10:28.993 #12 NEW cov: 10754 ft: 16211 corp: 7/181b lim: 60 exec/s: 6 rss: 70Mb L: 34/42 MS: 1 CopyPart- 00:10:28.993 #12 DONE cov: 10754 ft: 16211 corp: 
7/181b lim: 60 exec/s: 6 rss: 70Mb 00:10:28.993 Done 12 runs in 2 second(s) 00:10:29.253 13:22:47 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:10:29.253 13:22:47 -- ../common.sh@72 -- # (( i++ )) 00:10:29.253 13:22:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:29.253 13:22:47 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:10:29.253 13:22:47 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:10:29.253 13:22:47 -- vfio/run.sh@23 -- # local timen=1 00:10:29.253 13:22:47 -- vfio/run.sh@24 -- # local core=0x1 00:10:29.253 13:22:47 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:29.253 13:22:47 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:10:29.253 13:22:47 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:10:29.253 13:22:47 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:10:29.253 13:22:47 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:10:29.253 13:22:47 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:29.253 13:22:47 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:10:29.253 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:29.253 13:22:47 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:10:29.253 [2024-07-24 13:22:48.021537] Starting SPDK v24.01.1-pre git sha1 
dbef7efac / DPDK 23.11.0 initialization... 00:10:29.253 [2024-07-24 13:22:48.021625] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176575 ] 00:10:29.253 EAL: No free 2048 kB hugepages reported on node 1 00:10:29.512 [2024-07-24 13:22:48.149969] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:29.512 [2024-07-24 13:22:48.197229] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:29.512 [2024-07-24 13:22:48.197427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.771 INFO: Running with entropic power schedule (0xFF, 100). 00:10:29.771 INFO: Seed: 276917898 00:10:29.771 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:29.771 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:29.771 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:29.771 INFO: A corpus is not provided, starting from an empty corpus 00:10:29.771 #2 INITED exec/s: 0 rss: 61Mb 00:10:29.771 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:29.771 This may also happen if the target rejected all inputs we tried so far 00:10:29.771 [2024-07-24 13:22:48.527246] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:29.771 [2024-07-24 13:22:48.527297] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:29.771 [2024-07-24 13:22:48.527326] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.288 NEW_FUNC[1/638]: 0x49e680 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:10:30.288 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:30.288 #13 NEW cov: 10726 ft: 10687 corp: 2/32b lim: 40 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:10:30.547 [2024-07-24 13:22:49.212366] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:30.547 [2024-07-24 13:22:49.212422] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:30.547 [2024-07-24 13:22:49.212449] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.547 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:30.547 #14 NEW cov: 10762 ft: 13175 corp: 3/63b lim: 40 exec/s: 0 rss: 70Mb L: 31/31 MS: 1 ChangeBit- 00:10:30.806 [2024-07-24 13:22:49.459080] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:30.806 [2024-07-24 13:22:49.459112] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:30.806 [2024-07-24 13:22:49.459137] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.806 #18 NEW cov: 10762 ft: 13927 corp: 4/86b lim: 40 exec/s: 18 rss: 70Mb L: 23/31 MS: 4 
ChangeBinInt-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:10:31.064 [2024-07-24 13:22:49.713935] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:31.064 [2024-07-24 13:22:49.713967] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:31.064 [2024-07-24 13:22:49.713992] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:31.064 #21 NEW cov: 10762 ft: 14481 corp: 5/96b lim: 40 exec/s: 21 rss: 70Mb L: 10/31 MS: 3 InsertByte-ChangeBit-CMP- DE: "\000\000\000\000\004M\3165"- 00:10:31.323 [2024-07-24 13:22:49.959621] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:31.323 [2024-07-24 13:22:49.959662] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:31.323 [2024-07-24 13:22:49.959687] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:31.323 #22 NEW cov: 10762 ft: 14854 corp: 6/107b lim: 40 exec/s: 22 rss: 70Mb L: 11/31 MS: 1 InsertByte- 00:10:31.582 [2024-07-24 13:22:50.201082] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:31.582 [2024-07-24 13:22:50.201121] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:31.582 [2024-07-24 13:22:50.201145] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:31.582 #23 NEW cov: 10769 ft: 15133 corp: 7/138b lim: 40 exec/s: 23 rss: 70Mb L: 31/31 MS: 1 ChangeByte- 00:10:31.582 [2024-07-24 13:22:50.446566] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:31.582 [2024-07-24 13:22:50.446605] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:31.582 [2024-07-24 13:22:50.446630] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:31.840 #24 NEW cov: 10769 ft: 15355 
corp: 8/170b lim: 40 exec/s: 12 rss: 71Mb L: 32/32 MS: 1 InsertByte- 00:10:31.840 #24 DONE cov: 10769 ft: 15355 corp: 8/170b lim: 40 exec/s: 12 rss: 71Mb 00:10:31.840 ###### Recommended dictionary. ###### 00:10:31.840 "\000\000\000\000\004M\3165" # Uses: 0 00:10:31.840 ###### End of recommended dictionary. ###### 00:10:31.840 Done 24 runs in 2 second(s) 00:10:32.099 13:22:50 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:10:32.099 13:22:50 -- ../common.sh@72 -- # (( i++ )) 00:10:32.099 13:22:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:32.099 13:22:50 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:10:32.099 13:22:50 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:10:32.099 13:22:50 -- vfio/run.sh@23 -- # local timen=1 00:10:32.099 13:22:50 -- vfio/run.sh@24 -- # local core=0x1 00:10:32.099 13:22:50 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:32.099 13:22:50 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:10:32.099 13:22:50 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:10:32.099 13:22:50 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:10:32.099 13:22:50 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:10:32.099 13:22:50 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:32.099 13:22:50 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:10:32.099 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:32.099 13:22:50 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 
/tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:10:32.099 [2024-07-24 13:22:50.919575] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:32.099 [2024-07-24 13:22:50.919651] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176943 ] 00:10:32.359 EAL: No free 2048 kB hugepages reported on node 1 00:10:32.359 [2024-07-24 13:22:51.046104] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:32.359 [2024-07-24 13:22:51.093020] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:32.359 [2024-07-24 13:22:51.093225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:32.617 INFO: Running with entropic power schedule (0xFF, 100). 00:10:32.617 INFO: Seed: 3175912575 00:10:32.617 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:32.617 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:32.617 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:32.617 INFO: A corpus is not provided, starting from an empty corpus 00:10:32.617 #2 INITED exec/s: 0 rss: 61Mb 00:10:32.617 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:32.617 This may also happen if the target rejected all inputs we tried so far 00:10:32.617 [2024-07-24 13:22:51.415695] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:33.134 NEW_FUNC[1/636]: 0x49f060 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:10:33.134 NEW_FUNC[2/636]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:33.134 #12 NEW cov: 10711 ft: 10656 corp: 2/77b lim: 80 exec/s: 0 rss: 68Mb L: 76/76 MS: 5 CrossOver-CrossOver-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:10:33.393 [2024-07-24 13:22:52.041370] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:10:33.393 [2024-07-24 13:22:52.041429] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:10:33.393 NEW_FUNC[1/3]: 0x134fdd0 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:10:33.393 NEW_FUNC[2/3]: 0x1350060 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:10:33.393 #14 NEW cov: 10755 ft: 13569 corp: 3/93b lim: 80 exec/s: 0 rss: 70Mb L: 16/76 MS: 2 ChangeBit-InsertRepeatedBytes- 00:10:33.393 [2024-07-24 13:22:52.250237] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:33.652 #15 NEW cov: 10755 ft: 15057 corp: 4/135b lim: 80 exec/s: 15 rss: 70Mb L: 42/76 MS: 1 InsertRepeatedBytes- 00:10:33.652 [2024-07-24 13:22:52.447024] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:33.910 #16 NEW cov: 10755 ft: 15565 corp: 5/177b lim: 80 exec/s: 16 rss: 70Mb L: 42/76 MS: 1 ChangeBit- 00:10:33.910 [2024-07-24 13:22:52.646219] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:33.910 #17 NEW cov: 
10755 ft: 15813 corp: 6/201b lim: 80 exec/s: 17 rss: 70Mb L: 24/76 MS: 1 CMP- DE: "\377\377\377\377\004\255\220\313"- 00:10:34.169 [2024-07-24 13:22:52.846902] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:34.169 #18 NEW cov: 10755 ft: 16217 corp: 7/251b lim: 80 exec/s: 18 rss: 70Mb L: 50/76 MS: 1 InsertRepeatedBytes- 00:10:34.427 [2024-07-24 13:22:53.047351] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:34.427 #19 NEW cov: 10762 ft: 16311 corp: 8/293b lim: 80 exec/s: 19 rss: 71Mb L: 42/76 MS: 1 PersAutoDict- DE: "\377\377\377\377\004\255\220\313"- 00:10:34.427 [2024-07-24 13:22:53.246898] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:34.685 #20 NEW cov: 10762 ft: 16762 corp: 9/335b lim: 80 exec/s: 10 rss: 71Mb L: 42/76 MS: 1 ChangeByte- 00:10:34.685 #20 DONE cov: 10762 ft: 16762 corp: 9/335b lim: 80 exec/s: 10 rss: 71Mb 00:10:34.685 ###### Recommended dictionary. ###### 00:10:34.685 "\377\377\377\377\004\255\220\313" # Uses: 1 00:10:34.685 ###### End of recommended dictionary. 
###### 00:10:34.685 Done 20 runs in 2 second(s) 00:10:34.944 13:22:53 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:10:34.944 13:22:53 -- ../common.sh@72 -- # (( i++ )) 00:10:34.944 13:22:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:34.944 13:22:53 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:10:34.944 13:22:53 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:10:34.944 13:22:53 -- vfio/run.sh@23 -- # local timen=1 00:10:34.944 13:22:53 -- vfio/run.sh@24 -- # local core=0x1 00:10:34.944 13:22:53 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:34.944 13:22:53 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:10:34.944 13:22:53 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:10:34.944 13:22:53 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:10:34.944 13:22:53 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:10:34.944 13:22:53 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:34.944 13:22:53 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:10:34.944 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:34.944 13:22:53 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:10:34.944 [2024-07-24 13:22:53.704150] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 
initialization... 00:10:34.944 [2024-07-24 13:22:53.704228] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3177314 ] 00:10:34.944 EAL: No free 2048 kB hugepages reported on node 1 00:10:35.203 [2024-07-24 13:22:53.832173] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:35.203 [2024-07-24 13:22:53.881380] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:35.203 [2024-07-24 13:22:53.881583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:35.461 INFO: Running with entropic power schedule (0xFF, 100). 00:10:35.461 INFO: Seed: 1667945421 00:10:35.461 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:35.461 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:35.461 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:35.461 INFO: A corpus is not provided, starting from an empty corpus 00:10:35.461 #2 INITED exec/s: 0 rss: 61Mb 00:10:35.461 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:35.461 This may also happen if the target rejected all inputs we tried so far 00:10:35.978 NEW_FUNC[1/632]: 0x49f740 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:10:35.978 NEW_FUNC[2/632]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:35.978 #5 NEW cov: 10696 ft: 10671 corp: 2/98b lim: 320 exec/s: 0 rss: 69Mb L: 97/97 MS: 3 ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:10:36.236 #6 NEW cov: 10714 ft: 13325 corp: 3/195b lim: 320 exec/s: 0 rss: 70Mb L: 97/97 MS: 1 ChangeBit- 00:10:36.494 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:36.494 #7 NEW cov: 10734 ft: 13452 corp: 4/292b lim: 320 exec/s: 7 rss: 70Mb L: 97/97 MS: 1 ChangeByte- 00:10:36.494 #13 NEW cov: 10734 ft: 13660 corp: 5/389b lim: 320 exec/s: 13 rss: 70Mb L: 97/97 MS: 1 ChangeBinInt- 00:10:36.752 #14 NEW cov: 10734 ft: 14268 corp: 6/486b lim: 320 exec/s: 14 rss: 70Mb L: 97/97 MS: 1 CrossOver- 00:10:37.011 #20 NEW cov: 10734 ft: 14322 corp: 7/559b lim: 320 exec/s: 20 rss: 70Mb L: 73/97 MS: 1 EraseBytes- 00:10:37.269 #27 NEW cov: 10741 ft: 14960 corp: 8/686b lim: 320 exec/s: 27 rss: 71Mb L: 127/127 MS: 2 ChangeBit-InsertRepeatedBytes- 00:10:37.528 #28 NEW cov: 10741 ft: 15026 corp: 9/784b lim: 320 exec/s: 14 rss: 71Mb L: 98/127 MS: 1 InsertByte- 00:10:37.528 #28 DONE cov: 10741 ft: 15026 corp: 9/784b lim: 320 exec/s: 14 rss: 71Mb 00:10:37.528 Done 28 runs in 2 second(s) 00:10:37.787 13:22:56 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:10:37.787 13:22:56 -- ../common.sh@72 -- # (( i++ )) 00:10:37.787 13:22:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:37.787 13:22:56 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:10:37.787 13:22:56 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:10:37.787 13:22:56 -- vfio/run.sh@23 -- # local timen=1 
00:10:37.787 13:22:56 -- vfio/run.sh@24 -- # local core=0x1 00:10:37.787 13:22:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:37.787 13:22:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:10:37.787 13:22:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:10:37.787 13:22:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:10:37.787 13:22:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:10:37.787 13:22:56 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:37.787 13:22:56 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:10:37.787 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:37.787 13:22:56 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:10:37.787 [2024-07-24 13:22:56.638779] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:10:37.787 [2024-07-24 13:22:56.638851] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3177679 ] 00:10:38.046 EAL: No free 2048 kB hugepages reported on node 1 00:10:38.046 [2024-07-24 13:22:56.764994] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:38.046 [2024-07-24 13:22:56.813217] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:38.046 [2024-07-24 13:22:56.813418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:38.304 INFO: Running with entropic power schedule (0xFF, 100). 00:10:38.304 INFO: Seed: 311991292 00:10:38.304 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:38.304 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:38.304 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:38.304 INFO: A corpus is not provided, starting from an empty corpus 00:10:38.304 #2 INITED exec/s: 0 rss: 61Mb 00:10:38.304 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:38.304 This may also happen if the target rejected all inputs we tried so far 00:10:38.821 NEW_FUNC[1/632]: 0x49ffc0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:10:38.821 NEW_FUNC[2/632]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:38.821 #12 NEW cov: 10702 ft: 10233 corp: 2/119b lim: 320 exec/s: 0 rss: 68Mb L: 118/118 MS: 5 CrossOver-CrossOver-CrossOver-ChangeBit-InsertRepeatedBytes- 00:10:39.079 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:39.079 #15 NEW cov: 10734 ft: 13490 corp: 3/219b lim: 320 exec/s: 0 rss: 69Mb L: 100/118 MS: 3 ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:10:39.339 #16 NEW cov: 10736 ft: 14487 corp: 4/319b lim: 320 exec/s: 16 rss: 70Mb L: 100/118 MS: 1 ChangeBinInt- 00:10:39.598 #22 NEW cov: 10736 ft: 15588 corp: 5/437b lim: 320 exec/s: 22 rss: 70Mb L: 118/118 MS: 1 ChangeBinInt- 00:10:39.598 [2024-07-24 13:22:58.439224] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:10:39.598 [2024-07-24 13:22:58.439280] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:39.598 [2024-07-24 13:22:58.439298] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:39.598 [2024-07-24 13:22:58.439324] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:39.598 [2024-07-24 13:22:58.440201] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:10:39.598 [2024-07-24 13:22:58.440237] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:10:39.598 [2024-07-24 13:22:58.440262] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:10:39.857 NEW_FUNC[1/6]: 0x134fdd0 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:10:39.857 NEW_FUNC[2/6]: 0x1350060 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:10:39.857 #24 NEW cov: 10768 ft: 15767 corp: 6/519b lim: 320 exec/s: 24 rss: 70Mb L: 82/118 MS: 2 ChangeByte-InsertRepeatedBytes- 00:10:40.115 #25 NEW cov: 10775 ft: 16029 corp: 7/711b lim: 320 exec/s: 25 rss: 70Mb L: 192/192 MS: 1 InsertRepeatedBytes- 00:10:40.374 #34 NEW cov: 10775 ft: 16114 corp: 8/761b lim: 320 exec/s: 17 rss: 70Mb L: 50/192 MS: 4 CrossOver-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:10:40.374 #34 DONE cov: 10775 ft: 16114 corp: 8/761b lim: 320 exec/s: 17 rss: 70Mb 00:10:40.374 Done 34 runs in 2 second(s) 00:10:40.633 13:22:59 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:10:40.633 13:22:59 -- ../common.sh@72 -- # (( i++ )) 00:10:40.633 13:22:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:40.633 13:22:59 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:10:40.633 13:22:59 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:10:40.633 13:22:59 -- vfio/run.sh@23 -- # local timen=1 00:10:40.633 13:22:59 -- vfio/run.sh@24 -- # local core=0x1 00:10:40.633 13:22:59 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:40.633 13:22:59 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:10:40.633 13:22:59 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:10:40.633 13:22:59 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:10:40.633 13:22:59 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:10:40.633 13:22:59 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 
/tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:40.633 13:22:59 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:10:40.633 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:40.633 13:22:59 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:10:40.633 [2024-07-24 13:22:59.410961] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 00:10:40.633 [2024-07-24 13:22:59.411033] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3178047 ] 00:10:40.633 EAL: No free 2048 kB hugepages reported on node 1 00:10:40.955 [2024-07-24 13:22:59.538711] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.955 [2024-07-24 13:22:59.587093] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:40.955 [2024-07-24 13:22:59.587304] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:40.955 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:40.955 INFO: Seed: 3084006436 00:10:41.214 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:41.214 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:41.214 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:41.214 INFO: A corpus is not provided, starting from an empty corpus 00:10:41.214 #2 INITED exec/s: 0 rss: 61Mb 00:10:41.214 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:41.214 This may also happen if the target rejected all inputs we tried so far 00:10:41.214 [2024-07-24 13:22:59.949411] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:41.214 [2024-07-24 13:22:59.949479] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:41.732 NEW_FUNC[1/638]: 0x4a09c0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:10:41.732 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:41.732 #6 NEW cov: 10730 ft: 10400 corp: 2/46b lim: 120 exec/s: 0 rss: 68Mb L: 45/45 MS: 4 ChangeBit-InsertByte-InsertByte-InsertRepeatedBytes- 00:10:42.078 [2024-07-24 13:23:00.629033] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:42.078 [2024-07-24 13:23:00.629095] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:42.078 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:42.078 #7 NEW cov: 10764 ft: 13400 corp: 3/120b lim: 120 exec/s: 0 rss: 69Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:10:42.078 [2024-07-24 13:23:00.905932] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: 
Invalid argument 00:10:42.078 [2024-07-24 13:23:00.905979] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:42.337 #8 NEW cov: 10764 ft: 15129 corp: 4/145b lim: 120 exec/s: 8 rss: 70Mb L: 25/74 MS: 1 EraseBytes- 00:10:42.337 [2024-07-24 13:23:01.160617] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:42.337 [2024-07-24 13:23:01.160663] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:42.596 #9 NEW cov: 10764 ft: 15421 corp: 5/159b lim: 120 exec/s: 9 rss: 70Mb L: 14/74 MS: 1 CrossOver- 00:10:42.596 [2024-07-24 13:23:01.415943] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:42.596 [2024-07-24 13:23:01.415986] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:42.855 #12 NEW cov: 10764 ft: 15580 corp: 6/260b lim: 120 exec/s: 12 rss: 70Mb L: 101/101 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:10:42.855 [2024-07-24 13:23:01.671497] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:42.855 [2024-07-24 13:23:01.671539] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:43.113 #15 NEW cov: 10771 ft: 15642 corp: 7/309b lim: 120 exec/s: 15 rss: 70Mb L: 49/101 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:10:43.113 [2024-07-24 13:23:01.916802] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:43.113 [2024-07-24 13:23:01.916844] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:43.372 #16 pulse cov: 10771 ft: 16139 corp: 7/309b lim: 120 exec/s: 8 rss: 70Mb 00:10:43.372 #16 NEW cov: 10771 ft: 16139 corp: 8/383b lim: 120 exec/s: 8 rss: 70Mb L: 74/101 MS: 1 ShuffleBytes- 00:10:43.372 #16 DONE cov: 10771 ft: 16139 corp: 8/383b lim: 120 exec/s: 8 rss: 70Mb 00:10:43.372 Done 16 runs in 2 second(s) 00:10:43.631 
13:23:02 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:10:43.631 13:23:02 -- ../common.sh@72 -- # (( i++ )) 00:10:43.631 13:23:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:43.631 13:23:02 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:10:43.631 13:23:02 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:10:43.631 13:23:02 -- vfio/run.sh@23 -- # local timen=1 00:10:43.631 13:23:02 -- vfio/run.sh@24 -- # local core=0x1 00:10:43.631 13:23:02 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:43.631 13:23:02 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:10:43.631 13:23:02 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:10:43.631 13:23:02 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:10:43.631 13:23:02 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:10:43.631 13:23:02 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:43.631 13:23:02 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:10:43.631 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:43.631 13:23:02 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:10:43.631 [2024-07-24 13:23:02.411118] Starting SPDK v24.01.1-pre git sha1 dbef7efac / DPDK 23.11.0 initialization... 
00:10:43.631 [2024-07-24 13:23:02.411198] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3178441 ] 00:10:43.631 EAL: No free 2048 kB hugepages reported on node 1 00:10:43.890 [2024-07-24 13:23:02.538539] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.890 [2024-07-24 13:23:02.588527] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:43.890 [2024-07-24 13:23:02.588728] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.149 INFO: Running with entropic power schedule (0xFF, 100). 00:10:44.149 INFO: Seed: 1793016586 00:10:44.149 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:10:44.149 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:10:44.149 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:44.149 INFO: A corpus is not provided, starting from an empty corpus 00:10:44.149 #2 INITED exec/s: 0 rss: 61Mb 00:10:44.149 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:44.149 This may also happen if the target rejected all inputs we tried so far 00:10:44.149 [2024-07-24 13:23:02.912264] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.149 [2024-07-24 13:23:02.912329] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:44.671 NEW_FUNC[1/638]: 0x4a16b0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:10:44.671 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:44.671 #10 NEW cov: 10716 ft: 10689 corp: 2/90b lim: 90 exec/s: 0 rss: 68Mb L: 89/89 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:10:44.929 [2024-07-24 13:23:03.538036] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.929 [2024-07-24 13:23:03.538095] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:44.929 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:44.929 #11 NEW cov: 10756 ft: 14752 corp: 3/179b lim: 90 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 ShuffleBytes- 00:10:44.929 [2024-07-24 13:23:03.736725] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.929 [2024-07-24 13:23:03.736768] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.187 #12 NEW cov: 10756 ft: 15657 corp: 4/268b lim: 90 exec/s: 12 rss: 70Mb L: 89/89 MS: 1 ChangeByte- 00:10:45.188 [2024-07-24 13:23:03.926571] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:45.188 [2024-07-24 13:23:03.926614] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.446 #14 NEW cov: 10756 ft: 15880 corp: 5/356b lim: 90 exec/s: 14 
rss: 70Mb L: 88/89 MS: 2 ChangeByte-CrossOver- 00:10:45.446 [2024-07-24 13:23:04.135368] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:45.446 [2024-07-24 13:23:04.135411] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.446 #15 NEW cov: 10756 ft: 16292 corp: 6/445b lim: 90 exec/s: 15 rss: 70Mb L: 89/89 MS: 1 ChangeByte- 00:10:45.704 [2024-07-24 13:23:04.322684] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:45.704 [2024-07-24 13:23:04.322723] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.704 #16 NEW cov: 10756 ft: 16858 corp: 7/511b lim: 90 exec/s: 16 rss: 70Mb L: 66/89 MS: 1 EraseBytes- 00:10:45.704 [2024-07-24 13:23:04.509757] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:45.704 [2024-07-24 13:23:04.509796] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.962 #17 NEW cov: 10763 ft: 17142 corp: 8/585b lim: 90 exec/s: 17 rss: 70Mb L: 74/89 MS: 1 CMP- DE: "\377\377\377\377\000\000\000\000"- 00:10:45.962 [2024-07-24 13:23:04.698155] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:45.962 [2024-07-24 13:23:04.698195] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.962 #19 NEW cov: 10763 ft: 17297 corp: 9/594b lim: 90 exec/s: 19 rss: 70Mb L: 9/89 MS: 2 ChangeBit-PersAutoDict- DE: "\377\377\377\377\000\000\000\000"- 00:10:46.221 [2024-07-24 13:23:04.888523] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:46.221 [2024-07-24 13:23:04.888564] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:46.221 #20 NEW cov: 10763 ft: 17340 corp: 10/660b lim: 90 exec/s: 10 rss: 70Mb L: 66/89 MS: 1 CrossOver- 00:10:46.221 #20 DONE cov: 10763 
ft: 17340 corp: 10/660b lim: 90 exec/s: 10 rss: 70Mb 00:10:46.221 ###### Recommended dictionary. ###### 00:10:46.221 "\377\377\377\377\000\000\000\000" # Uses: 1 00:10:46.221 ###### End of recommended dictionary. ###### 00:10:46.221 Done 20 runs in 2 second(s) 00:10:46.480 13:23:05 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:10:46.480 13:23:05 -- ../common.sh@72 -- # (( i++ )) 00:10:46.480 13:23:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:46.480 13:23:05 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:10:46.480 00:10:46.480 real 0m20.404s 00:10:46.480 user 0m28.011s 00:10:46.480 sys 0m2.326s 00:10:46.480 13:23:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:46.480 13:23:05 -- common/autotest_common.sh@10 -- # set +x 00:10:46.480 ************************************ 00:10:46.480 END TEST vfio_fuzz 00:10:46.480 ************************************ 00:10:46.740 13:23:05 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:10:46.740 00:10:46.740 real 1m26.710s 00:10:46.740 user 2m5.867s 00:10:46.740 sys 0m12.136s 00:10:46.740 13:23:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:46.740 13:23:05 -- common/autotest_common.sh@10 -- # set +x 00:10:46.740 ************************************ 00:10:46.740 END TEST llvm_fuzz 00:10:46.740 ************************************ 00:10:46.740 13:23:05 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:10:46.740 13:23:05 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:10:46.740 13:23:05 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:10:46.740 13:23:05 -- common/autotest_common.sh@712 -- # xtrace_disable 00:10:46.740 13:23:05 -- common/autotest_common.sh@10 -- # set +x 00:10:46.740 13:23:05 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:10:46.740 13:23:05 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:10:46.740 13:23:05 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:10:46.740 13:23:05 -- common/autotest_common.sh@10 -- # set +x 00:10:52.016 INFO: APP 
EXITING 00:10:52.016 INFO: killing all VMs 00:10:52.016 INFO: killing vhost app 00:10:52.016 WARN: no vhost pid file found 00:10:52.016 INFO: EXIT DONE 00:10:55.307 Waiting for block devices as requested 00:10:55.307 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:10:55.307 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:55.307 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:55.307 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:55.307 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:55.566 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:55.566 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:55.566 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:55.825 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:55.825 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:55.825 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:56.084 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:56.084 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:56.084 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:56.343 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:56.343 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:56.343 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:11:02.914 Cleaning 00:11:02.914 Removing: /dev/shm/spdk_tgt_trace.pid3148371 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3145919 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3147068 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3148371 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3149025 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3149247 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3149488 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3149897 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3150137 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3150340 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3150536 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3150758 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3151355 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3153907 00:11:02.914 Removing: 
/var/run/dpdk/spdk_pid3154220 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3154496 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3154547 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3155072 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3155171 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3155655 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3155835 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3156052 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3156226 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3156436 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3156451 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3156906 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3157102 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3157296 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3157528 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3157749 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3157769 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3157959 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3158153 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3158369 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3158555 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3158752 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3158932 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3159131 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3159311 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3159515 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3159695 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3159890 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3160077 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3160272 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3160450 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3160655 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3160837 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3161094 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3161341 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3161539 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3161951 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3162309 
00:11:02.914 Removing: /var/run/dpdk/spdk_pid3162495 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3162697 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3162891 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3163084 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3163270 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3163466 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3163646 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3163845 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3164028 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3164226 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3164423 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3164644 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3164851 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3165082 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3165289 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3165545 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3165723 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3165923 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3166105 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3166301 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3166372 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3166615 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3167157 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3167524 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3167892 00:11:02.914 Removing: /var/run/dpdk/spdk_pid3168258 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3168626 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3168989 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3169355 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3169720 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3170060 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3170374 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3170668 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3171023 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3171385 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3171755 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3172122 00:11:02.915 Removing: 
/var/run/dpdk/spdk_pid3172486 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3172851 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3173221 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3173579 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3173945 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3174309 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3174679 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3175043 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3175407 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3175768 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3176202 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3176575 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3176943 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3177314 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3177679 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3178047 00:11:02.915 Removing: /var/run/dpdk/spdk_pid3178441 00:11:02.915 Clean 00:11:02.915 killing process with pid 3094364 00:11:04.817 killing process with pid 3094361 00:11:04.817 killing process with pid 3094363 00:11:05.104 killing process with pid 3094362 00:11:05.104 13:23:23 -- common/autotest_common.sh@1436 -- # return 0 00:11:05.104 13:23:23 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:11:05.104 13:23:23 -- common/autotest_common.sh@718 -- # xtrace_disable 00:11:05.104 13:23:23 -- common/autotest_common.sh@10 -- # set +x 00:11:05.104 13:23:23 -- spdk/autotest.sh@389 -- # timing_exit autotest 00:11:05.104 13:23:23 -- common/autotest_common.sh@718 -- # xtrace_disable 00:11:05.104 13:23:23 -- common/autotest_common.sh@10 -- # set +x 00:11:05.104 13:23:23 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:11:05.104 13:23:23 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:11:05.104 13:23:23 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:11:05.104 13:23:23 -- 
spdk/autotest.sh@394 -- # hash lcov 00:11:05.104 13:23:23 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:11:05.104 13:23:23 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:11:05.104 13:23:23 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:11:05.104 13:23:23 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:05.104 13:23:23 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:05.104 13:23:23 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:05.104 13:23:23 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:05.104 13:23:23 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:05.104 13:23:23 -- paths/export.sh@5 -- $ export PATH 00:11:05.104 13:23:23 -- paths/export.sh@6 -- $ echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:05.104 13:23:23 -- common/autobuild_common.sh@437 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:11:05.104 13:23:23 -- common/autobuild_common.sh@438 -- $ date +%s 00:11:05.104 13:23:23 -- common/autobuild_common.sh@438 -- $ mktemp -dt spdk_1721820203.XXXXXX 00:11:05.104 13:23:23 -- common/autobuild_common.sh@438 -- $ SPDK_WORKSPACE=/tmp/spdk_1721820203.haJ8NN 00:11:05.104 13:23:23 -- common/autobuild_common.sh@440 -- $ [[ -n '' ]] 00:11:05.104 13:23:23 -- common/autobuild_common.sh@444 -- $ '[' -n v23.11 ']' 00:11:05.104 13:23:23 -- common/autobuild_common.sh@445 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:11:05.104 13:23:23 -- common/autobuild_common.sh@445 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:11:05.104 13:23:23 -- common/autobuild_common.sh@451 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:11:05.104 13:23:23 -- common/autobuild_common.sh@453 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:11:05.104 13:23:23 -- common/autobuild_common.sh@454 -- $ get_config_params 00:11:05.104 13:23:23 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:11:05.104 13:23:23 -- common/autotest_common.sh@10 -- $ set +x 00:11:05.380 13:23:23 -- common/autobuild_common.sh@454 -- $ 
config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:11:05.380 13:23:23 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:11:05.380 13:23:23 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:05.380 13:23:23 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:11:05.380 13:23:23 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:11:05.380 13:23:23 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:11:05.380 13:23:23 -- spdk/autopackage.sh@19 -- $ timing_finish 00:11:05.380 13:23:23 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:11:05.380 13:23:23 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:11:05.380 13:23:23 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:11:05.380 13:23:23 -- spdk/autopackage.sh@20 -- $ exit 0 00:11:05.380 + [[ -n 3038208 ]] 00:11:05.380 + sudo kill 3038208 00:11:05.389 [Pipeline] } 00:11:05.407 [Pipeline] // stage 00:11:05.412 [Pipeline] } 00:11:05.429 [Pipeline] // timeout 00:11:05.434 [Pipeline] } 00:11:05.452 [Pipeline] // catchError 00:11:05.459 [Pipeline] } 00:11:05.477 [Pipeline] // wrap 00:11:05.483 [Pipeline] } 00:11:05.498 [Pipeline] // catchError 00:11:05.506 [Pipeline] stage 00:11:05.508 [Pipeline] { (Epilogue) 00:11:05.519 [Pipeline] catchError 00:11:05.521 [Pipeline] { 00:11:05.534 [Pipeline] echo 00:11:05.536 Cleanup processes 00:11:05.541 [Pipeline] sh 00:11:05.824 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:05.824 3185782 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:05.836 [Pipeline] 
sh 00:11:06.117 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:06.118 ++ grep -v 'sudo pgrep' 00:11:06.118 ++ awk '{print $1}' 00:11:06.118 + sudo kill -9 00:11:06.118 + true 00:11:06.129 [Pipeline] sh 00:11:06.410 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:11:07.800 [Pipeline] sh 00:11:08.081 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:11:08.081 Artifacts sizes are good 00:11:08.096 [Pipeline] archiveArtifacts 00:11:08.103 Archiving artifacts 00:11:08.186 [Pipeline] sh 00:11:08.469 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:11:08.484 [Pipeline] cleanWs 00:11:08.494 [WS-CLEANUP] Deleting project workspace... 00:11:08.494 [WS-CLEANUP] Deferred wipeout is used... 00:11:08.500 [WS-CLEANUP] done 00:11:08.502 [Pipeline] } 00:11:08.522 [Pipeline] // catchError 00:11:08.536 [Pipeline] sh 00:11:08.818 + logger -p user.info -t JENKINS-CI 00:11:08.827 [Pipeline] } 00:11:08.846 [Pipeline] // stage 00:11:08.852 [Pipeline] } 00:11:08.870 [Pipeline] // node 00:11:08.876 [Pipeline] End of Pipeline 00:11:08.916 Finished: SUCCESS