00:00:00.002 Started by upstream project "autotest-spdk-master-vs-dpdk-v23.11" build number 1017 00:00:00.002 originally caused by: 00:00:00.003 Started by upstream project "nightly-trigger" build number 3679 00:00:00.003 originally caused by: 00:00:00.003 Started by timer 00:00:00.080 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.081 The recommended git tool is: git 00:00:00.081 using credential 00000000-0000-0000-0000-000000000002 00:00:00.083 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.105 Fetching changes from the remote Git repository 00:00:00.123 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.139 Using shallow fetch with depth 1 00:00:00.139 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.139 > git --version # timeout=10 00:00:00.158 > git --version # 'git version 2.39.2' 00:00:00.158 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.174 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.174 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.271 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.283 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.295 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:03.295 > git config core.sparsecheckout # timeout=10 00:00:03.305 > git read-tree -mu HEAD # timeout=10 00:00:03.322 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:03.344 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:03.344 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:03.454 [Pipeline] Start of Pipeline 00:00:03.465 [Pipeline] library 00:00:03.467 Loading library shm_lib@master 00:00:03.467 Library shm_lib@master is cached. Copying from home. 00:00:03.487 [Pipeline] node 00:00:03.515 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.516 [Pipeline] { 00:00:03.524 [Pipeline] catchError 00:00:03.525 [Pipeline] { 00:00:03.538 [Pipeline] wrap 00:00:03.547 [Pipeline] { 00:00:03.555 [Pipeline] stage 00:00:03.557 [Pipeline] { (Prologue) 00:00:03.758 [Pipeline] sh 00:00:04.037 + logger -p user.info -t JENKINS-CI 00:00:04.059 [Pipeline] echo 00:00:04.060 Node: WFP20 00:00:04.067 [Pipeline] sh 00:00:04.368 [Pipeline] setCustomBuildProperty 00:00:04.382 [Pipeline] echo 00:00:04.384 Cleanup processes 00:00:04.391 [Pipeline] sh 00:00:04.680 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.680 1603723 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.694 [Pipeline] sh 00:00:04.984 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.984 ++ grep -v 'sudo pgrep' 00:00:04.984 ++ awk '{print $1}' 00:00:04.984 + sudo kill -9 00:00:04.984 + true 00:00:04.995 [Pipeline] cleanWs 00:00:05.003 [WS-CLEANUP] Deleting project workspace... 00:00:05.003 [WS-CLEANUP] Deferred wipeout is used... 
00:00:05.008 [WS-CLEANUP] done 00:00:05.011 [Pipeline] setCustomBuildProperty 00:00:05.022 [Pipeline] sh 00:00:05.300 + sudo git config --global --replace-all safe.directory '*' 00:00:05.390 [Pipeline] httpRequest 00:00:06.522 [Pipeline] echo 00:00:06.524 Sorcerer 10.211.164.20 is alive 00:00:06.530 [Pipeline] retry 00:00:06.532 [Pipeline] { 00:00:06.543 [Pipeline] httpRequest 00:00:06.547 HttpMethod: GET 00:00:06.548 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.548 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.568 Response Code: HTTP/1.1 200 OK 00:00:06.568 Success: Status code 200 is in the accepted range: 200,404 00:00:06.569 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:09.735 [Pipeline] } 00:00:09.754 [Pipeline] // retry 00:00:09.762 [Pipeline] sh 00:00:10.049 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:10.063 [Pipeline] httpRequest 00:00:10.447 [Pipeline] echo 00:00:10.448 Sorcerer 10.211.164.20 is alive 00:00:10.456 [Pipeline] retry 00:00:10.458 [Pipeline] { 00:00:10.471 [Pipeline] httpRequest 00:00:10.475 HttpMethod: GET 00:00:10.475 URL: http://10.211.164.20/packages/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz 00:00:10.476 Sending request to url: http://10.211.164.20/packages/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz 00:00:10.495 Response Code: HTTP/1.1 200 OK 00:00:10.496 Success: Status code 200 is in the accepted range: 200,404 00:00:10.496 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz 00:02:00.183 [Pipeline] } 00:02:00.199 [Pipeline] // retry 00:02:00.207 [Pipeline] sh 00:02:00.489 + tar --no-same-owner -xf spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz 00:02:03.034 [Pipeline] sh 00:02:03.316 + git -C spdk log --oneline -n5 00:02:03.317 35cd3e84d bdev/part: Pass through dif_check_flags via dif_check_flags_exclude_mask 00:02:03.317 01a2c4855 bdev/passthru: Pass through dif_check_flags via dif_check_flags_exclude_mask 00:02:03.317 9094b9600 bdev: Assert to check if I/O pass dif_check_flags not enabled by bdev 00:02:03.317 2e10c84c8 nvmf: Expose DIF type of namespace to host again 00:02:03.317 38b931b23 nvmf: Set bdev_ext_io_opts::dif_check_flags_exclude_mask for read/write 00:02:03.336 [Pipeline] withCredentials 00:02:03.346 > git --version # timeout=10 00:02:03.361 > git --version # 'git version 2.39.2' 00:02:03.378 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:02:03.381 [Pipeline] { 00:02:03.391 [Pipeline] retry 00:02:03.393 [Pipeline] { 00:02:03.410 [Pipeline] sh 00:02:03.695 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:02:03.707 [Pipeline] } 00:02:03.725 [Pipeline] // retry 00:02:03.730 [Pipeline] } 00:02:03.747 [Pipeline] // withCredentials 00:02:03.757 [Pipeline] httpRequest 00:02:04.188 [Pipeline] echo 00:02:04.190 Sorcerer 10.211.164.20 is alive 00:02:04.200 [Pipeline] retry 00:02:04.202 [Pipeline] { 00:02:04.216 [Pipeline] httpRequest 00:02:04.220 HttpMethod: GET 00:02:04.220 URL: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:02:04.221 Sending request to url: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:02:04.221 Response Code: HTTP/1.1 200 OK 00:02:04.221 Success: Status code 200 is in the accepted range: 200,404 
00:02:04.222 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:02:10.040 [Pipeline] } 00:02:10.054 [Pipeline] // retry 00:02:10.061 [Pipeline] sh 00:02:10.349 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:02:11.747 [Pipeline] sh 00:02:12.032 + git -C dpdk log --oneline -n5 00:02:12.032 eeb0605f11 version: 23.11.0 00:02:12.032 238778122a doc: update release notes for 23.11 00:02:12.032 46aa6b3cfc doc: fix description of RSS features 00:02:12.032 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:02:12.032 7e421ae345 devtools: support skipping forbid rule check 00:02:12.042 [Pipeline] } 00:02:12.056 [Pipeline] // stage 00:02:12.065 [Pipeline] stage 00:02:12.067 [Pipeline] { (Prepare) 00:02:12.086 [Pipeline] writeFile 00:02:12.101 [Pipeline] sh 00:02:12.434 + logger -p user.info -t JENKINS-CI 00:02:12.446 [Pipeline] sh 00:02:12.731 + logger -p user.info -t JENKINS-CI 00:02:12.746 [Pipeline] sh 00:02:13.033 + cat autorun-spdk.conf 00:02:13.033 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:13.033 SPDK_TEST_FUZZER_SHORT=1 00:02:13.033 SPDK_TEST_FUZZER=1 00:02:13.033 SPDK_TEST_SETUP=1 00:02:13.033 SPDK_RUN_UBSAN=1 00:02:13.033 SPDK_TEST_NATIVE_DPDK=v23.11 00:02:13.033 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.040 RUN_NIGHTLY=1 00:02:13.045 [Pipeline] readFile 00:02:13.069 [Pipeline] withEnv 00:02:13.071 [Pipeline] { 00:02:13.083 [Pipeline] sh 00:02:13.369 + set -ex 00:02:13.369 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:02:13.369 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:13.369 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:13.369 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:13.369 ++ SPDK_TEST_FUZZER=1 00:02:13.369 ++ SPDK_TEST_SETUP=1 00:02:13.369 ++ SPDK_RUN_UBSAN=1 00:02:13.369 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:13.369 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.369 ++ RUN_NIGHTLY=1 00:02:13.369 + case $SPDK_TEST_NVMF_NICS in 00:02:13.369 + DRIVERS= 00:02:13.369 + [[ -n '' ]] 00:02:13.369 + exit 0 00:02:13.378 [Pipeline] } 00:02:13.396 [Pipeline] // withEnv 00:02:13.400 [Pipeline] } 00:02:13.412 [Pipeline] // stage 00:02:13.419 [Pipeline] catchError 00:02:13.421 [Pipeline] { 00:02:13.432 [Pipeline] timeout 00:02:13.433 Timeout set to expire in 30 min 00:02:13.434 [Pipeline] { 00:02:13.449 [Pipeline] stage 00:02:13.452 [Pipeline] { (Tests) 00:02:13.469 [Pipeline] sh 00:02:13.760 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:13.760 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:13.760 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:02:13.761 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:02:13.761 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:13.761 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:13.761 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:02:13.761 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:13.761 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:13.761 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:13.761 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:02:13.761 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:13.761 + source /etc/os-release 00:02:13.761 ++ NAME='Fedora Linux' 00:02:13.761 ++ VERSION='39 (Cloud Edition)' 00:02:13.761 ++ ID=fedora 00:02:13.761 ++ VERSION_ID=39 00:02:13.761 ++ VERSION_CODENAME= 00:02:13.761 ++ PLATFORM_ID=platform:f39 00:02:13.761 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:02:13.761 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:13.761 ++ LOGO=fedora-logo-icon 00:02:13.761 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:02:13.761 ++ HOME_URL=https://fedoraproject.org/ 00:02:13.761 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:02:13.761 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:13.761 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:13.761 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:13.761 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:02:13.761 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:13.761 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:02:13.761 ++ SUPPORT_END=2024-11-12 00:02:13.761 ++ VARIANT='Cloud Edition' 00:02:13.761 ++ VARIANT_ID=cloud 00:02:13.761 + uname -a 00:02:13.761 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:02:13.761 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:17.085 Hugepages 00:02:17.085 node hugesize free / total 00:02:17.085 node0 1048576kB 0 / 0 00:02:17.085 node0 2048kB 0 / 0 00:02:17.085 node1 1048576kB 0 / 0 00:02:17.085 node1 2048kB 0 / 0 00:02:17.085 00:02:17.085 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:17.085 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:17.085 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:17.085 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:02:17.085 + rm -f /tmp/spdk-ld-path 00:02:17.085 + source autorun-spdk.conf 00:02:17.085 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:17.085 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:17.085 ++ SPDK_TEST_FUZZER=1 00:02:17.085 ++ SPDK_TEST_SETUP=1 00:02:17.085 ++ SPDK_RUN_UBSAN=1 00:02:17.085 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:17.085 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:17.085 ++ RUN_NIGHTLY=1 00:02:17.085 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:17.086 + [[ -n '' ]] 00:02:17.086 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:17.086 + for M in 
/var/spdk/build-*-manifest.txt 00:02:17.086 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:02:17.086 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:17.086 + for M in /var/spdk/build-*-manifest.txt 00:02:17.086 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:17.086 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:17.086 + for M in /var/spdk/build-*-manifest.txt 00:02:17.086 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:17.086 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:17.086 ++ uname 00:02:17.086 + [[ Linux == \L\i\n\u\x ]] 00:02:17.086 + sudo dmesg -T 00:02:17.086 + sudo dmesg --clear 00:02:17.086 + dmesg_pid=1605234 00:02:17.086 + [[ Fedora Linux == FreeBSD ]] 00:02:17.086 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:17.086 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:17.086 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:17.086 + [[ -x /usr/src/fio-static/fio ]] 00:02:17.086 + export FIO_BIN=/usr/src/fio-static/fio 00:02:17.086 + sudo dmesg -Tw 00:02:17.086 + FIO_BIN=/usr/src/fio-static/fio 00:02:17.086 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:17.086 + [[ ! -v VFIO_QEMU_BIN ]] 00:02:17.086 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:17.086 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:17.086 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:17.086 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:17.086 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:17.086 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:17.086 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:17.086 19:15:36 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:02:17.086 19:15:36 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_TEST_FUZZER_SHORT=1 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_SETUP=1 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_RUN_UBSAN=1 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:17.086 19:15:36 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1 00:02:17.086 19:15:36 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT 00:02:17.086 19:15:36 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:17.086 19:15:36 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:02:17.086 19:15:36 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:17.086 19:15:36 -- scripts/common.sh@15 -- $ shopt -s extglob 00:02:17.086 19:15:36 -- scripts/common.sh@544 -- $ [[ -e 
/bin/wpdk_common.sh ]] 00:02:17.086 19:15:36 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:17.086 19:15:36 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:17.086 19:15:36 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.086 19:15:36 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.086 19:15:36 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.086 19:15:36 -- paths/export.sh@5 -- $ export PATH 00:02:17.086 19:15:36 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:17.086 19:15:36 -- common/autobuild_common.sh@492 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:17.086 19:15:36 -- common/autobuild_common.sh@493 -- $ date +%s 00:02:17.086 19:15:36 -- common/autobuild_common.sh@493 -- $ mktemp -dt spdk_1732904136.XXXXXX 00:02:17.086 19:15:36 -- common/autobuild_common.sh@493 -- $ SPDK_WORKSPACE=/tmp/spdk_1732904136.bEMcpa 00:02:17.086 19:15:36 -- common/autobuild_common.sh@495 -- $ [[ -n '' ]] 00:02:17.086 19:15:36 -- common/autobuild_common.sh@499 -- $ '[' -n v23.11 ']' 00:02:17.086 19:15:36 -- common/autobuild_common.sh@500 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:17.086 19:15:36 -- common/autobuild_common.sh@500 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:02:17.086 19:15:36 -- common/autobuild_common.sh@506 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:17.086 19:15:36 -- common/autobuild_common.sh@508 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:17.086 19:15:36 
-- common/autobuild_common.sh@509 -- $ get_config_params 00:02:17.086 19:15:36 -- common/autotest_common.sh@409 -- $ xtrace_disable 00:02:17.086 19:15:36 -- common/autotest_common.sh@10 -- $ set +x 00:02:17.086 19:15:36 -- common/autobuild_common.sh@509 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:02:17.086 19:15:36 -- common/autobuild_common.sh@511 -- $ start_monitor_resources 00:02:17.086 19:15:36 -- pm/common@17 -- $ local monitor 00:02:17.086 19:15:36 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.086 19:15:36 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.086 19:15:36 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.086 19:15:36 -- pm/common@21 -- $ date +%s 00:02:17.086 19:15:36 -- pm/common@21 -- $ date +%s 00:02:17.086 19:15:36 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:17.086 19:15:36 -- pm/common@25 -- $ sleep 1 00:02:17.086 19:15:36 -- pm/common@21 -- $ date +%s 00:02:17.086 19:15:36 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732904136 00:02:17.086 19:15:36 -- pm/common@21 -- $ date +%s 00:02:17.086 19:15:36 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732904136 00:02:17.086 19:15:36 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732904136 00:02:17.086 19:15:36 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732904136 00:02:17.086 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732904136_collect-vmstat.pm.log 00:02:17.086 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732904136_collect-bmc-pm.bmc.pm.log 00:02:17.086 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732904136_collect-cpu-load.pm.log 00:02:17.086 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732904136_collect-cpu-temp.pm.log 00:02:18.028 19:15:37 -- common/autobuild_common.sh@512 -- $ trap stop_monitor_resources EXIT 00:02:18.028 19:15:37 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:18.028 19:15:37 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:18.028 19:15:37 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:18.028 19:15:37 -- spdk/autobuild.sh@16 -- $ date -u 00:02:18.028 Fri Nov 29 06:15:37 PM UTC 2024 00:02:18.028 19:15:37 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:18.028 v25.01-pre-276-g35cd3e84d 00:02:18.028 19:15:37 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:18.028 19:15:37 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:18.028 19:15:37 -- spdk/autobuild.sh@24 
-- $ run_test ubsan echo 'using ubsan' 00:02:18.028 19:15:37 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:02:18.028 19:15:37 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:18.028 19:15:37 -- common/autotest_common.sh@10 -- $ set +x 00:02:18.028 ************************************ 00:02:18.028 START TEST ubsan 00:02:18.028 ************************************ 00:02:18.028 19:15:37 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan' 00:02:18.028 using ubsan 00:02:18.028 00:02:18.028 real 0m0.001s 00:02:18.028 user 0m0.000s 00:02:18.028 sys 0m0.000s 00:02:18.028 19:15:37 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:18.028 19:15:37 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:18.028 ************************************ 00:02:18.028 END TEST ubsan 00:02:18.028 ************************************ 00:02:18.288 19:15:37 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:02:18.289 19:15:37 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:02:18.289 19:15:37 -- common/autobuild_common.sh@449 -- $ run_test build_native_dpdk _build_native_dpdk 00:02:18.289 19:15:37 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:02:18.289 19:15:37 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:18.289 19:15:37 -- common/autotest_common.sh@10 -- $ set +x 00:02:18.289 ************************************ 00:02:18.289 START TEST build_native_dpdk 00:02:18.289 ************************************ 00:02:18.289 19:15:37 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:02:18.289 19:15:37 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:02:18.289 eeb0605f11 version: 23.11.0 00:02:18.289 238778122a doc: update release notes for 23.11 00:02:18.289 46aa6b3cfc doc: fix description of RSS features 00:02:18.289 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:02:18.289 7e421ae345 devtools: support skipping forbid rule check 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@102 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base" "power/acpi" "power/amd_pstate" "power/cppc" "power/intel_pstate" "power/intel_uncore" "power/kvm_vm") 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@103 -- $ local mlx5_libs_added=n 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@146 -- $ [[ 0 -eq 1 ]] 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@174 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@175 -- $ uname -s 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@175 -- $ '[' Linux = Linux ']' 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 21.11.0 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 
00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1 00:02:18.289 patching file config/rte_config.h 00:02:18.289 Hunk #1 succeeded at 60 (offset 1 line). 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@183 -- $ lt 23.11.0 24.07.0 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 24.07.0 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@184 -- $ patch -p1 00:02:18.289 patching file lib/pcapng/rte_pcapng.c 00:02:18.289 19:15:38 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ge 23.11.0 24.07.0 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 23.11.0 '>=' 24.07.0 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:18.289 19:15:38 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:18.290 19:15:38 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:02:18.290 19:15:38 build_native_dpdk -- common/autobuild_common.sh@190 -- $ dpdk_kmods=false 00:02:18.290 19:15:38 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s 00:02:18.290 19:15:38 build_native_dpdk -- common/autobuild_common.sh@191 -- $ '[' Linux = FreeBSD ']' 00:02:18.290 19:15:38 build_native_dpdk -- common/autobuild_common.sh@195 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base power/acpi power/amd_pstate power/cppc power/intel_pstate power/intel_uncore power/kvm_vm 00:02:18.290 19:15:38 build_native_dpdk -- common/autobuild_common.sh@195 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm, 00:02:22.491 The Meson build system 00:02:22.491 Version: 1.5.0 00:02:22.491 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:22.491 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:22.491 Build type: native build 00:02:22.491 Program cat found: YES (/usr/bin/cat) 00:02:22.491 Project name: DPDK 00:02:22.491 Project version: 23.11.0 00:02:22.491 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:22.491 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:22.491 Host machine cpu family: x86_64 00:02:22.491 Host machine cpu: x86_64 00:02:22.491 Message: ## Building in Developer Mode ## 00:02:22.491 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:22.491 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:22.491 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:22.491 Program python3 found: YES (/usr/bin/python3) 00:02:22.491 Program cat found: YES (/usr/bin/cat) 00:02:22.491 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:02:22.491 Compiler for C supports arguments -march=native: YES 00:02:22.491 Checking for size of "void *" : 8 00:02:22.491 Checking for size of "void *" : 8 (cached) 00:02:22.491 Library m found: YES 00:02:22.491 Library numa found: YES 00:02:22.491 Has header "numaif.h" : YES 00:02:22.491 Library fdt found: NO 00:02:22.491 Library execinfo found: NO 00:02:22.491 Has header "execinfo.h" : YES 00:02:22.491 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:22.491 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:22.491 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:22.491 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:22.491 Run-time dependency openssl found: YES 3.1.1 00:02:22.491 Run-time dependency libpcap found: YES 1.10.4 00:02:22.491 Has header "pcap.h" with dependency libpcap: YES 00:02:22.491 Compiler for C supports arguments -Wcast-qual: YES 00:02:22.491 Compiler for C supports arguments -Wdeprecated: YES 00:02:22.491 Compiler for C supports arguments -Wformat: YES 00:02:22.491 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:22.491 Compiler for C supports arguments -Wformat-security: NO 00:02:22.491 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:22.491 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:22.491 Compiler for C supports arguments -Wnested-externs: YES 00:02:22.491 Compiler for C supports arguments -Wold-style-definition: YES 00:02:22.491 Compiler for C supports arguments -Wpointer-arith: YES 00:02:22.491 Compiler for C supports arguments -Wsign-compare: YES 00:02:22.491 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:22.491 Compiler for C supports arguments -Wundef: YES 00:02:22.491 Compiler for C supports arguments -Wwrite-strings: YES 00:02:22.491 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:22.491 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:22.491 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:22.491 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:22.491 Program objdump found: YES (/usr/bin/objdump) 00:02:22.491 Compiler for C supports arguments -mavx512f: YES 00:02:22.491 Checking if "AVX512 checking" compiles: YES 00:02:22.491 Fetching value of define "__SSE4_2__" : 1 00:02:22.491 Fetching value of define "__AES__" : 1 00:02:22.491 Fetching value of define "__AVX__" : 1 00:02:22.491 Fetching value of define "__AVX2__" : 1 00:02:22.491 Fetching value of define "__AVX512BW__" : 1 00:02:22.491 Fetching value of define "__AVX512CD__" : 1 00:02:22.491 Fetching value of define "__AVX512DQ__" : 1 00:02:22.491 Fetching value of define "__AVX512F__" : 1 00:02:22.491 Fetching value of define "__AVX512VL__" : 1 00:02:22.491 Fetching value of define "__PCLMUL__" : 1 00:02:22.491 Fetching value of define "__RDRND__" : 1 00:02:22.491 Fetching value of define "__RDSEED__" : 1 00:02:22.491 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:22.491 Fetching value of define "__znver1__" : (undefined) 00:02:22.491 Fetching value of define "__znver2__" : (undefined) 00:02:22.491 Fetching value of define "__znver3__" : (undefined) 00:02:22.491 Fetching value of define "__znver4__" : (undefined) 00:02:22.491 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:22.491 Message: lib/log: Defining dependency "log" 00:02:22.491 Message: lib/kvargs: Defining dependency "kvargs" 00:02:22.491 Message: lib/telemetry: Defining dependency 
"telemetry" 00:02:22.491 Checking for function "getentropy" : NO 00:02:22.491 Message: lib/eal: Defining dependency "eal" 00:02:22.491 Message: lib/ring: Defining dependency "ring" 00:02:22.491 Message: lib/rcu: Defining dependency "rcu" 00:02:22.491 Message: lib/mempool: Defining dependency "mempool" 00:02:22.491 Message: lib/mbuf: Defining dependency "mbuf" 00:02:22.491 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:22.491 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:22.491 Compiler for C supports arguments -mpclmul: YES 00:02:22.491 Compiler for C supports arguments -maes: YES 00:02:22.491 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:22.491 Compiler for C supports arguments -mavx512bw: YES 00:02:22.491 Compiler for C supports arguments -mavx512dq: YES 00:02:22.491 Compiler for C supports arguments -mavx512vl: YES 00:02:22.491 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:22.491 Compiler for C supports arguments -mavx2: YES 00:02:22.491 Compiler for C supports arguments -mavx: YES 00:02:22.491 Message: lib/net: Defining dependency "net" 00:02:22.491 Message: lib/meter: Defining dependency "meter" 00:02:22.491 Message: lib/ethdev: Defining dependency "ethdev" 00:02:22.491 Message: lib/pci: Defining dependency "pci" 00:02:22.491 Message: lib/cmdline: Defining dependency "cmdline" 00:02:22.491 Message: lib/metrics: Defining dependency "metrics" 00:02:22.491 Message: lib/hash: Defining dependency "hash" 00:02:22.491 Message: lib/timer: Defining dependency "timer" 00:02:22.491 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:22.491 Message: lib/acl: Defining dependency "acl" 00:02:22.491 Message: lib/bbdev: Defining dependency "bbdev" 00:02:22.491 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:22.491 Run-time dependency libelf found: YES 0.191 00:02:22.491 Message: lib/bpf: Defining dependency "bpf" 00:02:22.491 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:22.491 Message: lib/compressdev: Defining dependency "compressdev" 00:02:22.491 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:22.491 Message: lib/distributor: Defining dependency "distributor" 00:02:22.491 Message: lib/dmadev: Defining dependency "dmadev" 00:02:22.491 Message: lib/efd: Defining dependency "efd" 00:02:22.491 Message: lib/eventdev: Defining dependency "eventdev" 00:02:22.491 Message: lib/dispatcher: Defining dependency "dispatcher" 00:02:22.491 Message: lib/gpudev: Defining dependency "gpudev" 00:02:22.491 Message: lib/gro: Defining dependency "gro" 00:02:22.491 Message: lib/gso: Defining dependency "gso" 00:02:22.491 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:22.491 Message: lib/jobstats: Defining dependency "jobstats" 00:02:22.491 Message: lib/latencystats: Defining dependency "latencystats" 00:02:22.491 Message: lib/lpm: Defining dependency "lpm" 00:02:22.491 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:02:22.491 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:22.491 Message: lib/member: Defining dependency "member" 00:02:22.491 Message: lib/pcapng: Defining dependency "pcapng" 00:02:22.491 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:22.491 Message: lib/power: Defining dependency "power" 00:02:22.491 Message: lib/rawdev: Defining dependency "rawdev" 00:02:22.491 Message: lib/regexdev: Defining dependency "regexdev" 00:02:22.491 Message: lib/mldev: Defining dependency "mldev" 00:02:22.491 Message: lib/rib: Defining dependency "rib" 00:02:22.491 Message: lib/reorder: Defining dependency "reorder" 00:02:22.491 Message: lib/sched: Defining dependency "sched" 00:02:22.491 Message: lib/security: Defining dependency "security" 00:02:22.491 Message: lib/stack: Defining dependency "stack" 00:02:22.491 Has header "linux/userfaultfd.h" : YES 00:02:22.491 Has header "linux/vduse.h" : YES 00:02:22.491 Message: lib/vhost: Defining dependency "vhost" 00:02:22.491 Message: lib/ipsec: Defining dependency "ipsec" 00:02:22.491 Message: lib/pdcp: Defining dependency "pdcp" 00:02:22.491 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:22.491 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:22.491 Message: lib/fib: Defining dependency "fib" 00:02:22.491 Message: lib/port: Defining dependency "port" 00:02:22.491 Message: lib/pdump: Defining dependency "pdump" 00:02:22.491 Message: lib/table: Defining dependency "table" 00:02:22.491 Message: lib/pipeline: Defining dependency "pipeline" 00:02:22.491 Message: lib/graph: Defining dependency "graph" 00:02:22.491 Message: lib/node: Defining dependency "node" 00:02:22.491 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:23.873 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:23.873 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:23.873 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:23.873 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:23.873 Compiler for C supports arguments -Wno-unused-value: YES 00:02:23.873 Compiler for C supports arguments -Wno-format: YES 00:02:23.873 Compiler for C supports arguments -Wno-format-security: YES 00:02:23.873 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:23.873 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:23.873 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:23.873 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:23.873 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:23.873 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:23.873 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:23.873 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:23.873 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:23.873 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:23.873 Has header "sys/epoll.h" : YES 00:02:23.873 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:23.873 Configuring doxy-api-html.conf using configuration 00:02:23.873 Configuring doxy-api-man.conf using configuration 00:02:23.873 Program mandb found: YES (/usr/bin/mandb) 00:02:23.873 Program sphinx-build found: NO 00:02:23.873 Configuring rte_build_config.h using configuration 00:02:23.873 Message: 00:02:23.873 ================= 00:02:23.873 Applications Enabled 
00:02:23.873 =================
00:02:23.873 
00:02:23.873 apps:
00:02:23.873 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:02:23.873 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:02:23.873 test-pmd, test-regex, test-sad, test-security-perf,
00:02:23.873 
00:02:23.873 Message:
00:02:23.873 =================
00:02:23.873 Libraries Enabled
00:02:23.873 =================
00:02:23.873 
00:02:23.873 libs:
00:02:23.873 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:23.873 net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:02:23.873 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:02:23.873 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:02:23.873 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:02:23.873 mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:02:23.873 pdcp, fib, port, pdump, table, pipeline, graph, node,
00:02:23.873 
00:02:23.873 
00:02:23.873 Message:
00:02:23.873 ===============
00:02:23.873 Drivers Enabled
00:02:23.873 ===============
00:02:23.873 
00:02:23.873 common:
00:02:23.873 
00:02:23.873 bus:
00:02:23.873 pci, vdev,
00:02:23.873 mempool:
00:02:23.873 ring,
00:02:23.873 dma:
00:02:23.873 
00:02:23.873 net:
00:02:23.873 i40e,
00:02:23.873 raw:
00:02:23.873 
00:02:23.873 crypto:
00:02:23.873 
00:02:23.873 compress:
00:02:23.873 
00:02:23.873 regex:
00:02:23.873 
00:02:23.873 ml:
00:02:23.873 
00:02:23.873 vdpa:
00:02:23.873 
00:02:23.873 event:
00:02:23.873 
00:02:23.873 baseband:
00:02:23.873 
00:02:23.873 gpu:
00:02:23.873 
00:02:23.873 
00:02:23.873 Message:
00:02:23.873 =================
00:02:23.873 Content Skipped
00:02:23.873 =================
00:02:23.873 
00:02:23.873 apps:
00:02:23.873 
00:02:23.873 libs:
00:02:23.873 
00:02:23.873 drivers:
00:02:23.873 common/cpt: not in enabled drivers build config
00:02:23.873 common/dpaax: not in enabled drivers build config
00:02:23.873 common/iavf: not in enabled drivers build config
00:02:23.873 common/idpf: not in enabled drivers build config
00:02:23.873 common/mvep: not in enabled drivers build config
00:02:23.873 common/octeontx: not in enabled drivers build config
00:02:23.873 bus/auxiliary: not in enabled drivers build config
00:02:23.873 bus/cdx: not in enabled drivers build config
00:02:23.873 bus/dpaa: not in enabled drivers build config
00:02:23.873 bus/fslmc: not in enabled drivers build config
00:02:23.873 bus/ifpga: not in enabled drivers build config
00:02:23.873 bus/platform: not in enabled drivers build config
00:02:23.873 bus/vmbus: not in enabled drivers build config
00:02:23.873 common/cnxk: not in enabled drivers build config
00:02:23.873 common/mlx5: not in enabled drivers build config
00:02:23.873 common/nfp: not in enabled drivers build config
00:02:23.873 common/qat: not in enabled drivers build config
00:02:23.873 common/sfc_efx: not in enabled drivers build config
00:02:23.873 mempool/bucket: not in enabled drivers build config
00:02:23.873 mempool/cnxk: not in enabled drivers build config
00:02:23.873 mempool/dpaa: not in enabled drivers build config
00:02:23.873 mempool/dpaa2: not in enabled drivers build config
00:02:23.873 mempool/octeontx: not in enabled drivers build config
00:02:23.873 mempool/stack: not in enabled drivers build config
00:02:23.873 dma/cnxk: not in enabled drivers build config
00:02:23.873 dma/dpaa: not in enabled drivers build config
00:02:23.873 dma/dpaa2: not in enabled drivers build config
00:02:23.873 dma/hisilicon: not in enabled drivers build config
00:02:23.873 dma/idxd: not in enabled drivers build config
00:02:23.873 dma/ioat: not in enabled drivers build config
00:02:23.874 dma/skeleton: not in enabled drivers build config
00:02:23.874 net/af_packet: not in enabled drivers build config
00:02:23.874 net/af_xdp: not in enabled drivers build config
00:02:23.874 net/ark: not in enabled drivers build config
00:02:23.874 net/atlantic: not in enabled drivers build config
00:02:23.874 net/avp: not in enabled drivers build config
00:02:23.874 net/axgbe: not in enabled drivers build config
00:02:23.874 net/bnx2x: not in enabled drivers build config
00:02:23.874 net/bnxt: not in enabled drivers build config
00:02:23.874 net/bonding: not in enabled drivers build config
00:02:23.874 net/cnxk: not in enabled drivers build config
00:02:23.874 net/cpfl: not in enabled drivers build config
00:02:23.874 net/cxgbe: not in enabled drivers build config
00:02:23.874 net/dpaa: not in enabled drivers build config
00:02:23.874 net/dpaa2: not in enabled drivers build config
00:02:23.874 net/e1000: not in enabled drivers build config
00:02:23.874 net/ena: not in enabled drivers build config
00:02:23.874 net/enetc: not in enabled drivers build config
00:02:23.874 net/enetfec: not in enabled drivers build config
00:02:23.874 net/enic: not in enabled drivers build config
00:02:23.874 net/failsafe: not in enabled drivers build config
00:02:23.874 net/fm10k: not in enabled drivers build config
00:02:23.874 net/gve: not in enabled drivers build config
00:02:23.874 net/hinic: not in enabled drivers build config
00:02:23.874 net/hns3: not in enabled drivers build config
00:02:23.874 net/iavf: not in enabled drivers build config
00:02:23.874 net/ice: not in enabled drivers build config
00:02:23.874 net/idpf: not in enabled drivers build config
00:02:23.874 net/igc: not in enabled drivers build config
00:02:23.874 net/ionic: not in enabled drivers build config
00:02:23.874 net/ipn3ke: not in enabled drivers build config
00:02:23.874 net/ixgbe: not in enabled drivers build config
00:02:23.874 net/mana: not in enabled drivers build config
00:02:23.874 net/memif: not in enabled drivers build config
00:02:23.874 net/mlx4: not in enabled drivers build config
00:02:23.874 net/mlx5: not in enabled drivers build config
00:02:23.874 net/mvneta: not in enabled drivers build config
00:02:23.874 net/mvpp2: not in enabled drivers build config
00:02:23.874 net/netvsc: not in enabled drivers build config
00:02:23.874 net/nfb: not in enabled drivers build config
00:02:23.874 net/nfp: not in enabled drivers build config
00:02:23.874 net/ngbe: not in enabled drivers build config
00:02:23.874 net/null: not in enabled drivers build config
00:02:23.874 net/octeontx: not in enabled drivers build config
00:02:23.874 net/octeon_ep: not in enabled drivers build config
00:02:23.874 net/pcap: not in enabled drivers build config
00:02:23.874 net/pfe: not in enabled drivers build config
00:02:23.874 net/qede: not in enabled drivers build config
00:02:23.874 net/ring: not in enabled drivers build config
00:02:23.874 net/sfc: not in enabled drivers build config
00:02:23.874 net/softnic: not in enabled drivers build config
00:02:23.874 net/tap: not in enabled drivers build config
00:02:23.874 net/thunderx: not in enabled drivers build config
00:02:23.874 net/txgbe: not in enabled drivers build config
00:02:23.874 net/vdev_netvsc: not in enabled drivers build config
00:02:23.874 net/vhost: not in enabled drivers build config
00:02:23.874 net/virtio: not in enabled drivers build config
00:02:23.874 net/vmxnet3: not in enabled drivers build config
00:02:23.874 raw/cnxk_bphy: not in enabled drivers build config
00:02:23.874 raw/cnxk_gpio: not in enabled drivers build config
00:02:23.874 raw/dpaa2_cmdif: not in enabled drivers build config
00:02:23.874 raw/ifpga: not in enabled drivers build config
00:02:23.874 raw/ntb: not in enabled drivers build config
00:02:23.874 raw/skeleton: not in enabled drivers build config
00:02:23.874 crypto/armv8: not in enabled drivers build config
00:02:23.874 crypto/bcmfs: not in enabled drivers build config
00:02:23.874 crypto/caam_jr: not in enabled drivers build config
00:02:23.874 crypto/ccp: not in enabled drivers build config
00:02:23.874 crypto/cnxk: not in enabled drivers build config
00:02:23.874 crypto/dpaa_sec: not in enabled drivers build config
00:02:23.874 crypto/dpaa2_sec: not in enabled drivers build config
00:02:23.874 crypto/ipsec_mb: not in enabled drivers build config
00:02:23.874 crypto/mlx5: not in enabled drivers build config
00:02:23.874 crypto/mvsam: not in enabled drivers build config
00:02:23.874 crypto/nitrox: not in enabled drivers build config
00:02:23.874 crypto/null: not in enabled drivers build config
00:02:23.874 crypto/octeontx: not in enabled drivers build config
00:02:23.874 crypto/openssl: not in enabled drivers build config
00:02:23.874 crypto/scheduler: not in enabled drivers build config
00:02:23.874 crypto/uadk: not in enabled drivers build config
00:02:23.874 crypto/virtio: not in enabled drivers build config
00:02:23.874 compress/isal: not in enabled drivers build config
00:02:23.874 compress/mlx5: not in enabled drivers build config
00:02:23.874 compress/octeontx: not in enabled drivers build config
00:02:23.874 compress/zlib: not in enabled drivers build config
00:02:23.874 regex/mlx5: not in enabled drivers build config
00:02:23.874 regex/cn9k: not in enabled drivers build config
00:02:23.874 ml/cnxk: not in enabled drivers build config
00:02:23.874 vdpa/ifc: not in enabled drivers build config
00:02:23.874 vdpa/mlx5: not in enabled drivers build config
00:02:23.874 vdpa/nfp: not in enabled drivers build config
00:02:23.874 vdpa/sfc: not in enabled drivers build config
00:02:23.874 event/cnxk: not in enabled drivers build config
00:02:23.874 event/dlb2: not in enabled drivers build config
00:02:23.874 event/dpaa: not in enabled drivers build config
00:02:23.874 event/dpaa2: not in enabled drivers build config
00:02:23.874 event/dsw: not in enabled drivers build config
00:02:23.874 event/opdl: not in enabled drivers build config
00:02:23.874 event/skeleton: not in enabled drivers build config
00:02:23.874 event/sw: not in enabled drivers build config
00:02:23.874 event/octeontx: not in enabled drivers build config
00:02:23.874 baseband/acc: not in enabled drivers build config
00:02:23.874 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:23.874 baseband/fpga_lte_fec: not in enabled drivers build config
00:02:23.874 baseband/la12xx: not in enabled drivers build config
00:02:23.874 baseband/null: not in enabled drivers build config
00:02:23.874 baseband/turbo_sw: not in enabled drivers build config
00:02:23.874 gpu/cuda: not in enabled drivers build config
00:02:23.874 
00:02:23.874 
00:02:23.874 Build targets in project: 217
00:02:23.874 
00:02:23.874 DPDK 23.11.0
00:02:23.874 
00:02:23.874 User defined options
00:02:23.874 libdir : lib
00:02:23.874 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:23.874 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:23.874 c_link_args :
00:02:23.874 enable_docs : false
00:02:23.874 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm,
00:02:23.874 enable_kmods : false
00:02:23.874 machine : native
00:02:23.874 tests : false
00:02:23.874 
00:02:23.874 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:23.874 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
00:02:23.874 19:15:43 build_native_dpdk -- common/autobuild_common.sh@199 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:02:23.874 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:24.140 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:24.140 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:24.140 [3/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:24.140 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:24.140 [5/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:24.140 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:24.140 [7/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:24.140 [8/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:24.140 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:24.140 [10/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:24.140 [11/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:24.140 [12/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:24.140 [13/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:24.140 [14/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:24.403 [15/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:24.403 [16/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:24.403 [17/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:24.403 [18/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:24.403 [19/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:24.403 [20/707] Linking static target lib/librte_kvargs.a 00:02:24.403 [21/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:24.403 [22/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:24.403 [23/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:24.403 [24/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:24.403 [25/707] Linking static target lib/librte_pci.a 00:02:24.403 [26/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:24.403 [27/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:24.403 [28/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:24.403 [29/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:24.403 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:24.403 [31/707]
Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:24.403 [32/707] Linking static target lib/librte_log.a 00:02:24.403 [33/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:24.403 [34/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:24.403 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:24.665 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:24.665 [37/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:24.665 [38/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:24.665 [39/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.665 [40/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:24.665 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:24.665 [42/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:24.665 [43/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:24.665 [44/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:24.665 [45/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:24.665 [46/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:24.666 [47/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.666 [48/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:24.666 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:24.666 [50/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:24.929 [51/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:24.929 [52/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:24.929 [53/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:24.929 [54/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:24.929 [55/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:24.929 [56/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:24.929 [57/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:24.929 [58/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:24.929 [59/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:24.929 [60/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:24.929 [61/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:24.929 [62/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:24.929 [63/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:24.929 [64/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:24.929 [65/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:24.929 [66/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:24.929 [67/707] Linking static target lib/librte_meter.a 00:02:24.929 [68/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:24.929 [69/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:24.929 [70/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 
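Note: the "User defined options" summary above is meson's echo of the configure step, which ran before this excerpt begins; the deprecation WARNING indicates the wrapper invoked `meson <builddir> [options]` rather than the current `meson setup <builddir> [options]` spelling. A minimal sketch of an equivalent, non-deprecated invocation, reconstructed only from the options logged above (the actual wrapper is common/autobuild_common.sh, whose source is not shown in this log), would be:

    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir=lib \
        -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Denable_docs=false \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm \
        -Denable_kmods=false \
        -Dmachine=native \
        -Dtests=false
    # build step as traced at autobuild_common.sh@199 above
    ninja -C build-tmp -j112

Note that the build directory (build-tmp) and the install prefix (build) differ: ninja compiles into build-tmp, and the `ninja ... install` traced at the end of this log copies the results under the prefix.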
00:02:24.929 [71/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:24.929 [72/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:24.929 [73/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:24.929 [74/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:24.929 [75/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:24.929 [76/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:24.929 [77/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:24.929 [78/707] Linking static target lib/librte_cmdline.a 00:02:24.929 [79/707] Linking static target lib/librte_ring.a 00:02:24.929 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:24.929 [81/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:24.929 [82/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:24.929 [83/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:24.929 [84/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:24.929 [85/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:24.929 [86/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:24.929 [87/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:24.929 [88/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:24.929 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:24.929 [90/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:24.929 [91/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:24.929 [92/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:24.929 [93/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:24.929 [94/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:24.929 [95/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:24.929 [96/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:24.929 [97/707] Linking static target lib/librte_metrics.a 00:02:24.929 [98/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:24.929 [99/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:24.929 [100/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:24.929 [101/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:24.929 [102/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:24.929 [103/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:24.929 [104/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:24.929 [105/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:24.929 [106/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:24.929 [107/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:24.929 [108/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:25.194 [109/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:25.194 [110/707] Linking static target lib/librte_net.a 00:02:25.194 [111/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:25.194 [112/707] Compiling C object 
lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:25.194 [113/707] Linking static target lib/librte_bitratestats.a 00:02:25.194 [114/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:25.194 [115/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:25.194 [116/707] Linking static target lib/librte_cfgfile.a 00:02:25.194 [117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:25.194 [118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:25.194 [119/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:25.194 [120/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:25.194 [121/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:25.194 [122/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.195 [123/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:25.195 [124/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:25.195 [125/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:25.195 [126/707] Linking target lib/librte_log.so.24.0 00:02:25.195 [127/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:25.195 [128/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:25.195 [129/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.195 [130/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:25.195 [131/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:25.195 [132/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:25.195 [133/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:25.195 [134/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:25.454 [135/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:25.454 [136/707] Linking static target lib/librte_timer.a 00:02:25.454 [137/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.454 [138/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:25.454 [139/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:25.454 [140/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:25.454 [141/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:25.454 [142/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:25.454 [143/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:25.454 [144/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.454 [145/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:25.454 [146/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:25.454 [147/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:25.454 [148/707] Linking static target lib/librte_mempool.a 00:02:25.454 [149/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:25.454 [150/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:25.454 [151/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:25.454 [152/707] Compiling C object 
lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:25.454 [153/707] Linking static target lib/librte_bbdev.a 00:02:25.454 [154/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:25.454 [155/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:25.454 [156/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.454 [157/707] Linking target lib/librte_kvargs.so.24.0 00:02:25.454 [158/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:25.454 [159/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:25.454 [160/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:25.454 [161/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:25.718 [162/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:25.718 [163/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:25.718 [164/707] Linking static target lib/librte_jobstats.a 00:02:25.718 [165/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:25.718 [166/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:25.718 [167/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:25.718 [168/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:25.718 [169/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:25.718 [170/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:25.718 [171/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:25.718 [172/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:25.718 [173/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:25.718 [174/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:25.718 [175/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.718 [176/707] Linking static target lib/librte_compressdev.a 00:02:25.718 [177/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:25.718 [178/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:25.718 [179/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:25.718 [180/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:25.718 [181/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:25.718 [182/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:25.718 [183/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.718 [184/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:25.718 [185/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:25.718 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:02:25.718 [187/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:02:25.718 [188/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:02:25.718 [189/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:25.718 [190/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:25.718 [191/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:02:25.718 [192/707] Linking static target 
lib/member/libsketch_avx512_tmp.a 00:02:25.718 [193/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:25.718 [194/707] Linking static target lib/librte_dispatcher.a 00:02:25.718 [195/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:25.718 [196/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:25.718 [197/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:25.718 [198/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:25.718 [199/707] Linking static target lib/librte_latencystats.a 00:02:25.980 [200/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:25.980 [201/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:25.980 [202/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:25.980 [203/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:02:25.980 [204/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:25.980 [205/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:25.980 [206/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:25.980 [207/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:25.980 [208/707] Linking static target lib/librte_telemetry.a 00:02:25.980 [209/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:25.980 [210/707] Linking static target lib/librte_rcu.a 00:02:25.980 [211/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:25.980 [212/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.980 [213/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:25.980 [214/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:25.980 [215/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:25.980 [216/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:25.980 [217/707] Linking static target lib/librte_gpudev.a 00:02:25.980 [218/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:25.980 [219/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:25.980 [220/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:25.980 [221/707] Linking static target lib/librte_eal.a 00:02:25.980 [222/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:25.980 [223/707] Linking static target lib/librte_stack.a 00:02:25.980 [224/707] Linking static target lib/librte_gro.a 00:02:25.980 [225/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:25.980 [226/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:25.980 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:25.980 [228/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:25.980 [229/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:25.980 [230/707] Linking static target lib/librte_distributor.a 00:02:25.980 [231/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:25.980 [232/707] Linking static target lib/librte_dmadev.a 00:02:25.980 [233/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:25.980 [234/707] Compiling C object 
lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:25.980 [235/707] Linking static target lib/librte_gso.a 00:02:25.980 [236/707] Linking static target lib/librte_regexdev.a 00:02:25.980 [237/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:02:25.980 [238/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:02:25.980 [239/707] Linking static target lib/librte_mldev.a 00:02:25.980 [240/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:25.980 [241/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:25.980 [242/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:25.980 [243/707] Linking static target lib/librte_rawdev.a 00:02:25.980 [244/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:26.241 [245/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:26.241 [246/707] Linking static target lib/librte_power.a 00:02:26.241 [247/707] Linking static target lib/librte_mbuf.a 00:02:26.241 [248/707] Linking static target lib/librte_ip_frag.a 00:02:26.241 [249/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:26.241 [250/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.241 [251/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:26.241 [252/707] Linking static target lib/librte_pcapng.a 00:02:26.241 [253/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:26.241 [254/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:26.242 [255/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.242 [256/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:02:26.242 [257/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:02:26.242 [258/707] Linking static target lib/librte_reorder.a 00:02:26.242 [259/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:26.242 [260/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:26.242 [261/707] Linking static target lib/librte_bpf.a 00:02:26.242 [262/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:26.242 [263/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:26.242 [264/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.242 [265/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:26.242 [266/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:02:26.242 [267/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:26.242 [268/707] Linking static target lib/librte_security.a 00:02:26.242 [269/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:02:26.242 [270/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.242 [271/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:26.242 [272/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.242 [273/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.242 [274/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.504 [275/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:26.504 [276/707] Compiling C 
object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:26.504 [277/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:26.504 [278/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.504 [279/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:26.504 [280/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:26.504 [281/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:26.504 [282/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:02:26.504 [283/707] Linking static target lib/librte_lpm.a 00:02:26.504 [284/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:26.504 [285/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:26.504 [286/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:26.504 [287/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:26.504 [288/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.504 [289/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.504 [290/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:26.504 [291/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:26.504 [292/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.504 [293/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:26.504 [294/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:26.504 [295/707] Linking static target lib/librte_rib.a 00:02:26.504 [296/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:02:26.504 [297/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [298/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [299/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [300/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:26.770 [301/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:26.770 [302/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:26.770 [303/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:26.770 [304/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:26.770 [305/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:26.770 [306/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:26.770 [307/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:26.770 [308/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [309/707] Linking target lib/librte_telemetry.so.24.0 00:02:26.770 [310/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [311/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:26.770 [312/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:26.770 [313/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:26.770 [314/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:26.770 
[315/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:26.770 [316/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:26.770 [317/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [318/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:26.770 [319/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:26.770 [320/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.770 [321/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:26.770 [322/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:26.770 [323/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:27.033 [324/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:27.033 [325/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:27.033 [326/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.033 [327/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:27.033 [328/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:27.033 [329/707] Linking static target lib/librte_efd.a 00:02:27.033 [330/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:27.034 [331/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:27.034 [332/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:27.034 [333/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:02:27.034 [334/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:27.034 [335/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:27.034 [336/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:27.034 [337/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:27.034 [338/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.034 [339/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:27.034 [340/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:02:27.034 [341/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.034 [342/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:27.034 [343/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:27.034 [344/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.034 [345/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:27.034 [346/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:27.034 [347/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:27.034 [348/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:27.296 [349/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:27.296 [350/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:27.296 [351/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:27.296 [352/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:27.296 [353/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:27.296 [354/707] Compiling C object 
lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:27.296 [355/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:27.296 [356/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:27.296 [357/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:27.296 [358/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:02:27.296 [359/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:27.296 [360/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.296 [361/707] Linking static target lib/librte_fib.a 00:02:27.296 [362/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:27.296 [363/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:27.296 [364/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:27.296 [365/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:27.296 [366/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.296 [367/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.296 [368/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:27.296 [369/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.296 [370/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:02:27.296 [371/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:27.296 [372/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:27.562 [373/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:02:27.562 [374/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:27.562 [375/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:27.562 [376/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:27.562 [377/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.562 [378/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:27.562 [379/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:27.562 [380/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:27.562 [381/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:27.562 [382/707] Linking static target lib/librte_pdump.a 00:02:27.562 [383/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:02:27.562 [384/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:27.562 [385/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:27.562 [386/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:27.562 [387/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:02:27.562 [388/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:02:27.562 [389/707] Linking static target lib/librte_graph.a 00:02:27.562 [390/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:02:27.562 [391/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:27.562 [392/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:02:27.562 [393/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:02:27.562 [394/707] Compiling C object 
app/dpdk-graph.p/graph_mempool.c.o 00:02:27.562 [395/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:02:27.562 [396/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:27.854 [397/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:02:27.854 [398/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:27.854 [399/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:27.854 [400/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:02:27.854 [401/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:02:27.854 [402/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:27.854 [403/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:27.854 [404/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:02:27.854 [405/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:27.854 [406/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:27.854 [407/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:02:27.854 [408/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:27.854 [409/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:27.854 [410/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:02:27.854 [411/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:27.854 [412/707] Linking static target lib/librte_table.a 00:02:27.854 [413/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:27.854 [414/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:02:27.854 [415/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.854 [416/707] Linking static target lib/librte_sched.a 00:02:27.854 [417/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:27.854 [418/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:02:27.854 [419/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:02:27.854 [420/707] Linking static target drivers/librte_bus_vdev.a 00:02:27.854 [421/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:02:27.854 [422/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:27.854 [423/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:27.854 [424/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:27.854 [425/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:27.854 [426/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:02:27.854 [427/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.116 [428/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:28.116 [429/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:28.116 [430/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:28.116 [431/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:28.116 [432/707] Linking static target lib/librte_cryptodev.a 00:02:28.116 [433/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:28.116 
[434/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:28.116 [435/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:28.116 [436/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:28.116 [437/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:28.116 [438/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:28.116 [439/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:02:28.116 [440/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:28.116 [441/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:28.116 [442/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:28.116 [443/707] Linking static target drivers/librte_bus_pci.a 00:02:28.116 [444/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:02:28.116 [445/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:28.116 [446/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:02:28.116 [447/707] Linking static target lib/librte_ipsec.a 00:02:28.116 [448/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:28.116 [449/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:28.116 [450/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:28.377 [451/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:28.377 [452/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:28.377 [453/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:28.377 [454/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:28.377 [455/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:02:28.377 [456/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:02:28.377 [457/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:28.377 [458/707] Linking static target lib/librte_member.a 00:02:28.377 [459/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:28.377 [460/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:28.377 [461/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:02:28.377 [462/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:28.377 [463/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:28.377 [464/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:02:28.377 [465/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:02:28.377 [466/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:28.377 [467/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:28.377 [468/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:28.377 [469/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.377 [470/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.377 [471/707] Compiling C object 
app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:02:28.377 [472/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:02:28.377 [473/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:02:28.377 [474/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:28.377 [475/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:02:28.377 [476/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:02:28.377 [477/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:28.377 [478/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:28.377 [479/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:28.636 [480/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:28.637 [481/707] Linking static target lib/librte_node.a 00:02:28.637 [482/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:28.637 [483/707] Linking static target lib/librte_pdcp.a 00:02:28.637 [484/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:28.637 [485/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:28.637 [486/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:28.637 [487/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.637 [488/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:28.637 [489/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:28.637 [490/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:28.637 [491/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:28.637 [492/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:28.637 [493/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:28.637 [494/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:28.637 [495/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:28.637 [496/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:28.637 [497/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:28.637 [498/707] Linking static target drivers/librte_mempool_ring.a 00:02:28.637 [499/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.637 [500/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.637 [501/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:28.637 [502/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:28.637 [503/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:28.637 [504/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:28.637 [505/707] Linking static target lib/librte_hash.a 00:02:28.637 [506/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:28.637 [507/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:28.637 [508/707] Compiling C object 
app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:02:28.637 [509/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:28.637 [510/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.637 [511/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:28.637 [512/707] Linking static target lib/librte_port.a 00:02:28.637 [513/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:28.637 [514/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:28.896 [515/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:28.896 [516/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:28.896 [517/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:28.896 [518/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:02:28.896 [519/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.896 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:28.896 [521/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:28.896 [522/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:28.896 [523/707] Linking static target lib/librte_eventdev.a 00:02:28.896 [524/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:28.896 [525/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.896 [526/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:28.896 [527/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:28.896 [528/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.896 [529/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:28.896 [530/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.896 [531/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:28.896 [532/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:28.896 [533/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:28.896 [534/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:28.896 [535/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:28.896 [536/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:28.896 [537/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:28.896 [538/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:28.896 [539/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:28.896 [540/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:28.896 [541/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:28.896 [542/707] Linking static target lib/acl/libavx2_tmp.a 00:02:28.896 [543/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:28.896 [544/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:29.156 [545/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:29.156 [546/707] Compiling C object 
app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o
00:02:29.156 [547/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o
00:02:29.156 [548/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o
00:02:29.156 [549/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o
00:02:29.156 [550/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o
00:02:29.156 [551/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o
00:02:29.156 [552/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o
00:02:29.156 [553/707] Linking static target lib/librte_acl.a
00:02:29.156 [554/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o
00:02:29.156 [555/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o
00:02:29.156 [556/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o
00:02:29.156 [557/707] Linking static target drivers/net/i40e/base/libi40e_base.a
00:02:29.416 [558/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o
00:02:29.416 [559/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o
00:02:29.416 [560/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o
00:02:29.416 [561/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o
00:02:29.416 [562/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a
00:02:29.416 [563/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o
00:02:29.416 [564/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a
00:02:29.416 [565/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o
00:02:29.416 [566/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.416 [567/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.675 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o
00:02:29.675 [569/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.675 [570/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o
00:02:29.675 [571/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o
00:02:29.935 [572/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o
00:02:29.935 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o
00:02:29.935 [574/707] Linking static target lib/librte_ethdev.a
00:02:29.935 [575/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:30.194 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o
00:02:30.194 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o
00:02:30.763 [578/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o
00:02:30.763 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o
00:02:30.763 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o
00:02:31.333 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o
00:02:31.333 [582/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o
00:02:31.333 [583/707] Linking static target drivers/libtmp_rte_net_i40e.a
00:02:31.593 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command
00:02:31.593 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:31.593 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:31.852 [587/707] Linking static target drivers/librte_net_i40e.a
00:02:31.852 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:02:32.421 [589/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:02:32.421 [590/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:32.989 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:02:32.989 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:02:38.265 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.265 [594/707] Linking target lib/librte_eal.so.24.0
00:02:38.265 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols
00:02:38.265 [596/707] Linking target lib/librte_timer.so.24.0
00:02:38.265 [597/707] Linking target lib/librte_jobstats.so.24.0
00:02:38.265 [598/707] Linking target lib/librte_rawdev.so.24.0
00:02:38.265 [599/707] Linking target lib/librte_stack.so.24.0
00:02:38.265 [600/707] Linking target lib/librte_meter.so.24.0
00:02:38.265 [601/707] Linking target lib/librte_pci.so.24.0
00:02:38.265 [602/707] Linking target lib/librte_cfgfile.so.24.0
00:02:38.265 [603/707] Linking target lib/librte_ring.so.24.0
00:02:38.265 [604/707] Linking target lib/librte_dmadev.so.24.0
00:02:38.265 [605/707] Linking target drivers/librte_bus_vdev.so.24.0
00:02:38.265 [606/707] Linking target lib/librte_acl.so.24.0
00:02:38.525 [607/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols
00:02:38.525 [608/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols
00:02:38.525 [609/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols
00:02:38.525 [610/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols
00:02:38.525 [611/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols
00:02:38.525 [612/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols
00:02:38.525 [613/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols
00:02:38.525 [614/707] Linking target drivers/librte_bus_pci.so.24.0
00:02:38.525 [615/707] Linking target lib/librte_rcu.so.24.0
00:02:38.525 [616/707] Linking target lib/librte_mempool.so.24.0
00:02:38.784 [617/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols
00:02:38.784 [618/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols
00:02:38.784 [619/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols
00:02:38.784 [620/707] Linking target drivers/librte_mempool_ring.so.24.0
00:02:38.784 [621/707] Linking target lib/librte_rib.so.24.0
00:02:38.784 [622/707] Linking target lib/librte_mbuf.so.24.0
00:02:38.784 [623/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols
00:02:38.784 [624/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols
00:02:39.043 [625/707] Linking target lib/librte_fib.so.24.0
00:02:39.043 [626/707] Linking target lib/librte_net.so.24.0
00:02:39.043 [627/707] Linking target lib/librte_sched.so.24.0
00:02:39.043 [628/707] Linking target lib/librte_compressdev.so.24.0
00:02:39.043 [629/707] Linking target lib/librte_bbdev.so.24.0
00:02:39.043 [630/707] Linking target lib/librte_gpudev.so.24.0
00:02:39.043 [631/707] Linking target lib/librte_reorder.so.24.0
00:02:39.043 [632/707] Linking target lib/librte_distributor.so.24.0
00:02:39.043 [633/707] Linking target lib/librte_cryptodev.so.24.0
00:02:39.043 [634/707] Linking target lib/librte_mldev.so.24.0
00:02:39.043 [635/707] Linking target lib/librte_regexdev.so.24.0
00:02:39.043 [636/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.043 [637/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o
00:02:39.043 [638/707] Linking static target lib/librte_pipeline.a
00:02:39.043 [639/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols
00:02:39.043 [640/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols
00:02:39.043 [641/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols
00:02:39.043 [642/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols
00:02:39.043 [643/707] Linking target lib/librte_cmdline.so.24.0
00:02:39.043 [644/707] Linking target lib/librte_hash.so.24.0
00:02:39.043 [645/707] Linking target lib/librte_security.so.24.0
00:02:39.043 [646/707] Linking target lib/librte_ethdev.so.24.0
00:02:39.302 [647/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols
00:02:39.302 [648/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols
00:02:39.302 [649/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols
00:02:39.302 [650/707] Linking target lib/librte_lpm.so.24.0
00:02:39.302 [651/707] Linking target lib/librte_ipsec.so.24.0
00:02:39.302 [652/707] Linking target lib/librte_efd.so.24.0
00:02:39.302 [653/707] Linking target lib/librte_member.so.24.0
00:02:39.302 [654/707] Linking target lib/librte_pdcp.so.24.0
00:02:39.302 [655/707] Linking target lib/librte_pcapng.so.24.0
00:02:39.302 [656/707] Linking target lib/librte_power.so.24.0
00:02:39.302 [657/707] Linking target lib/librte_metrics.so.24.0
00:02:39.302 [658/707] Linking target lib/librte_bpf.so.24.0
00:02:39.302 [659/707] Linking target lib/librte_gso.so.24.0
00:02:39.302 [660/707] Linking target lib/librte_eventdev.so.24.0
00:02:39.302 [661/707] Linking target lib/librte_gro.so.24.0
00:02:39.302 [662/707] Linking target lib/librte_ip_frag.so.24.0
00:02:39.302 [663/707] Linking target drivers/librte_net_i40e.so.24.0
00:02:39.561 [664/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols
00:02:39.561 [665/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols
00:02:39.561 [666/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols
00:02:39.561 [667/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols
00:02:39.561 [668/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols
00:02:39.561 [669/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols
00:02:39.561 [670/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols
00:02:39.561 [671/707] Linking target lib/librte_graph.so.24.0
00:02:39.561 [672/707] Linking target lib/librte_dispatcher.so.24.0
00:02:39.561 [673/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:02:39.561 [674/707] Linking target lib/librte_bitratestats.so.24.0
00:02:39.561 [675/707] Linking target lib/librte_latencystats.so.24.0
00:02:39.561 [676/707] Linking target lib/librte_pdump.so.24.0
00:02:39.561 [677/707] Linking target lib/librte_port.so.24.0
00:02:39.561 [678/707] Linking static target lib/librte_vhost.a
00:02:39.820 [679/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols
00:02:39.820 [680/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols
00:02:39.820 [681/707] Linking target lib/librte_node.so.24.0
00:02:39.820 [682/707] Linking target lib/librte_table.so.24.0
00:02:39.820 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols
00:02:40.080 [684/707] Linking target app/dpdk-test-acl
00:02:40.080 [685/707] Linking target app/dpdk-pdump
00:02:40.080 [686/707] Linking target app/dpdk-dumpcap
00:02:40.080 [687/707] Linking target app/dpdk-graph
00:02:40.080 [688/707] Linking target app/dpdk-test-cmdline
00:02:40.080 [689/707] Linking target app/dpdk-test-gpudev
00:02:40.080 [690/707] Linking target app/dpdk-test-sad
00:02:40.080 [691/707] Linking target app/dpdk-test-security-perf
00:02:40.080 [692/707] Linking target app/dpdk-test-flow-perf
00:02:40.080 [693/707] Linking target app/dpdk-test-compress-perf
00:02:40.080 [694/707] Linking target app/dpdk-proc-info
00:02:40.080 [695/707] Linking target app/dpdk-test-pipeline
00:02:40.080 [696/707] Linking target app/dpdk-test-bbdev
00:02:40.080 [697/707] Linking target app/dpdk-test-crypto-perf
00:02:40.080 [698/707] Linking target app/dpdk-test-eventdev
00:02:40.080 [699/707] Linking target app/dpdk-test-dma-perf
00:02:40.080 [700/707] Linking target app/dpdk-test-fib
00:02:40.080 [701/707] Linking target app/dpdk-test-mldev
00:02:40.080 [702/707] Linking target app/dpdk-test-regex
00:02:40.080 [703/707] Linking target app/dpdk-testpmd
00:02:41.986 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:02:41.986 [705/707] Linking target lib/librte_vhost.so.24.0
00:02:44.522 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:44.522 [707/707] Linking target lib/librte_pipeline.so.24.0
00:02:44.522 19:16:04 build_native_dpdk -- common/autobuild_common.sh@201 -- $ uname -s
00:02:44.522 19:16:04 build_native_dpdk -- common/autobuild_common.sh@201 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:44.522 19:16:04 build_native_dpdk -- common/autobuild_common.sh@214 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
00:02:44.781 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:44.781 [0/1] Installing files.
00:02:45.045 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.045 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:45.046 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.047 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:02:45.048 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.049 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:45.050 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:45.050 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
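At this point the install switches from example sources to the core libraries: each librte_*.a static archive and its librte_*.so.24.0 shared counterpart land in build/lib, which is what downstream applications (here, the SPDK build) link against. As a minimal sketch of such a consumer, modeled on the helloworld example installed above and using only standard DPDK EAL calls (the pkg-config invocation and pkgconfig path below are assumptions about this tree's layout, not something the log shows):

```c
/* hello_dpdk.c: minimal consumer of the freshly installed librte_eal.
 * Assumed build line: cc hello_dpdk.c $(pkg-config --cflags --libs libdpdk)
 * with PKG_CONFIG_PATH pointed at .../dpdk/build/lib/pkgconfig. */
#include <stdio.h>
#include <rte_eal.h>
#include <rte_launch.h>
#include <rte_lcore.h>

static int
lcore_hello(void *arg)
{
	(void)arg;
	printf("hello from lcore %u\n", rte_lcore_id());
	return 0;
}

int
main(int argc, char **argv)
{
	/* rte_eal_init() consumes the EAL arguments (core list, PCI allow list, ...). */
	if (rte_eal_init(argc, argv) < 0) {
		fprintf(stderr, "EAL init failed\n");
		return 1;
	}
	/* Run lcore_hello on every lcore, including the main one, then reap them. */
	rte_eal_mp_remote_launch(lcore_hello, NULL, CALL_MAIN);
	rte_eal_mp_wait_lcore();
	rte_eal_cleanup();
	return 0;
}
```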
00:02:45.050 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.050 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_cryptodev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.051 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
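A side note on librte_ring, installed a few entries back: it is the lock-free FIFO that mempool, eventdev and several PMDs in this list build on. A sketch of single-producer/single-consumer use, assuming rte_eal_init() has already succeeded (the ring name "demo" and the round-trip check are illustrative):

```c
#include <rte_ring.h>
#include <rte_lcore.h>
#include <rte_errno.h>

/* Create a 1024-slot SP/SC ring and pass one pointer through it.
 * Ring size must be a power of two; usable capacity is size - 1. */
static int
ring_round_trip(void)
{
	void *in = (void *)0x1, *out = NULL;
	struct rte_ring *r = rte_ring_create("demo", 1024, rte_socket_id(),
					     RING_F_SP_ENQ | RING_F_SC_DEQ);

	if (r == NULL)
		return -rte_errno; /* e.g. -EEXIST if the name is already taken */
	if (rte_ring_enqueue(r, in) != 0 || rte_ring_dequeue(r, &out) != 0)
		return -1;
	return out == in ? 0 : -1;
}
```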
00:02:45.328 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.328 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing lib/librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:45.329 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:45.329 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:45.329 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:45.329 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:45.329 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-sad to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
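Note the pattern in this stretch of the log: each EAL synchronization/arch header is installed twice, a portable fallback into build/include/generic and an x86 specialization into build/include. Applications include only the top-level name and the arch variant pulls in the generic one. A sketch against the installed rte_spinlock.h (the counter is illustrative; the types and macros are standard DPDK):

```c
#include <stdint.h>
#include <rte_spinlock.h>

/* On this tree <rte_spinlock.h> resolves to the x86 header installed
 * above, which layers pause/TSX variants over generic/rte_spinlock.h. */
static rte_spinlock_t lock = RTE_SPINLOCK_INITIALIZER;
static uint64_t counter;

static void
counter_bump(void)
{
	rte_spinlock_lock(&lock);
	counter++;
	rte_spinlock_unlock(&lock);
}
```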
00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.329 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.330 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:45.331 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:45.331 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:45.331 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:45.331 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:45.331 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:45.331 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:45.331 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:45.331 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:45.331 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:45.331 Installing symlink pointing to librte_ring.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:45.331 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:45.331 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:45.331 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:45.331 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:45.331 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:45.331 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:45.331 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:45.331 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:45.331 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:45.331 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:45.331 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:45.331 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:45.331 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:45.331 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:45.331 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:45.331 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:45.331 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:45.331 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:45.331 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:45.331 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:45.331 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:45.331 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:45.331 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:45.331 Installing symlink pointing to librte_acl.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:45.331 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:45.331 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:45.331 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:45.331 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:45.331 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:45.331 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:45.331 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:45.331 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:45.331 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:45.331 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:45.331 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:45.331 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:45.331 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:45.331 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:45.331 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:45.331 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:45.331 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:45.331 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:45.331 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:45.331 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:45.331 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:45.331 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:45.331 Installing symlink pointing to librte_dispatcher.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:45.331 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:45.331 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:45.331 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:45.331 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:45.331 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:45.331 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:45.331 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:45.331 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:45.331 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:45.331 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:45.331 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:45.331 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:45.331 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:45.331 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:45.331 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:45.331 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:45.331 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:45.331 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:45.331 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:45.331 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:45.331 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:45.331 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:45.331 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:45.331 Installing symlink pointing to 
librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:45.331 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:45.331 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:45.331 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:45.331 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:45.331 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:45.331 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:45.331 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:45.331 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:45.331 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:45.331 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:45.331 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:45.331 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:45.331 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:45.331 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:45.331 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:45.331 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:45.331 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:45.331 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:45.331 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:45.331 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:45.331 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:45.331 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:45.331 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:45.331 Installing symlink pointing to librte_pdump.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:45.331 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:45.331 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:45.331 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:45.331 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:45.331 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:45.331 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:45.331 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:45.331 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:45.331 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:45.331 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:45.331 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:45.331 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:45.331 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:45.331 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:45.331 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:45.331 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:45.331 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:45.332 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:45.332 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:45.332 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:45.332 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:45.332 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:45.332 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:45.332 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:45.332 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:45.332 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:45.332 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:02:45.332 Installing symlink pointing to librte_net_i40e.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:02:45.332 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:02:45.590 19:16:05 build_native_dpdk -- common/autobuild_common.sh@220 -- $ cat 00:02:45.590 19:16:05 build_native_dpdk -- common/autobuild_common.sh@225 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:45.590 00:02:45.590 real 0m27.254s 00:02:45.590 user 8m2.114s 00:02:45.590 sys 2m33.374s 00:02:45.590 19:16:05 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:45.590 19:16:05 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:45.590 ************************************ 00:02:45.590 END TEST build_native_dpdk 00:02:45.590 ************************************ 00:02:45.590 19:16:05 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:45.590 19:16:05 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:45.590 19:16:05 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:45.590 19:16:05 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:45.590 19:16:05 -- common/autobuild_common.sh@445 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:45.590 19:16:05 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:02:45.590 19:16:05 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:45.590 19:16:05 -- common/autotest_common.sh@10 -- $ set +x 00:02:45.590 ************************************ 00:02:45.590 START TEST autobuild_llvm_precompile 00:02:45.590 ************************************ 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:45.590 Target: x86_64-redhat-linux-gnu 00:02:45.590 Thread model: posix 00:02:45.590 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:45.590 19:16:05 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:45.850 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:46.110 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:46.110 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:46.110 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:46.368 Using 'verbs' RDMA provider 00:03:02.189 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:14.401 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:14.918 Creating mk/config.mk...done. 00:03:14.918 Creating mk/cc.flags.mk...done. 00:03:14.918 Type 'make' to build. 00:03:14.918 00:03:14.918 real 0m29.476s 00:03:14.918 user 0m12.835s 00:03:14.918 sys 0m16.024s 00:03:14.918 19:16:34 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:03:14.918 19:16:34 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:03:14.918 ************************************ 00:03:14.918 END TEST autobuild_llvm_precompile 00:03:14.918 ************************************ 00:03:15.176 19:16:34 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:15.176 19:16:34 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:15.176 19:16:34 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:15.176 19:16:34 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:15.176 19:16:34 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:15.176 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:03:15.434 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:15.434 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:15.692 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:15.951 Using 'verbs' RDMA provider 00:03:29.097 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:41.456 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:41.456 Creating mk/config.mk...done. 00:03:41.456 Creating mk/cc.flags.mk...done. 00:03:41.456 Type 'make' to build. 
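The llvm-precompile stage above derives its toolchain from the installed clang rather than hard-coding one: it regex-matches 'clang --version', exports a versioned CC/CXX pair, and resolves the libFuzzer runtime that --with-fuzzer is pointed at. A minimal bash sketch of that flow, with simplified error handling (the real logic lives in autobuild_common.sh):

    # Pull the major version out of `clang --version` (same regex as traced above).
    if [[ $(clang --version) =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_num=${BASH_REMATCH[2]}        # 17 for "clang version 17.0.6"
    fi
    export CC=clang-$clang_num CXX=clang++-$clang_num
    # libFuzzer runtime shipped with that clang; fuzz targets link against it.
    fuzzer_lib=/usr/lib/clang/$clang_num/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
    [[ -e $fuzzer_lib ]] && ./configure --with-fuzzer="$fuzzer_lib"   # plus the flags listed above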
00:03:41.456 19:16:59 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:41.456 19:16:59 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:03:41.456 19:16:59 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:03:41.456 19:16:59 -- common/autotest_common.sh@10 -- $ set +x 00:03:41.456 ************************************ 00:03:41.456 START TEST make 00:03:41.456 ************************************ 00:03:41.456 19:16:59 make -- common/autotest_common.sh@1129 -- $ make -j112 00:03:41.456 make[1]: Nothing to be done for 'all'. 00:03:42.393 The Meson build system 00:03:42.393 Version: 1.5.0 00:03:42.393 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:42.393 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:42.393 Build type: native build 00:03:42.393 Project name: libvfio-user 00:03:42.393 Project version: 0.0.1 00:03:42.393 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:42.393 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:42.393 Host machine cpu family: x86_64 00:03:42.393 Host machine cpu: x86_64 00:03:42.393 Run-time dependency threads found: YES 00:03:42.393 Library dl found: YES 00:03:42.393 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:42.393 Run-time dependency json-c found: YES 0.17 00:03:42.393 Run-time dependency cmocka found: YES 1.1.7 00:03:42.393 Program pytest-3 found: NO 00:03:42.393 Program flake8 found: NO 00:03:42.393 Program misspell-fixer found: NO 00:03:42.393 Program restructuredtext-lint found: NO 00:03:42.393 Program valgrind found: YES (/usr/bin/valgrind) 00:03:42.393 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:42.393 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:42.393 Compiler for C supports arguments -Wwrite-strings: YES 00:03:42.393 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:42.393 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:42.393 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:42.393 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:42.393 Build targets in project: 8 00:03:42.393 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:42.393 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:42.393 00:03:42.393 libvfio-user 0.0.1 00:03:42.393 00:03:42.393 User defined options 00:03:42.393 buildtype : debug 00:03:42.393 default_library: static 00:03:42.393 libdir : /usr/local/lib 00:03:42.393 00:03:42.393 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:42.653 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:42.653 [1/36] Compiling C object samples/null.p/null.c.o 00:03:42.653 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:42.653 [3/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:42.653 [4/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:42.653 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:42.653 [6/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:42.653 [7/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:42.653 [8/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:42.653 [9/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:42.653 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:42.653 [11/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:42.653 [12/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:42.653 [13/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:42.653 [14/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:42.653 [15/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:42.653 [16/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:42.653 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:42.653 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:42.653 [19/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:42.653 [20/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:42.653 [21/36] Compiling C object samples/server.p/server.c.o 00:03:42.653 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:42.653 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:42.653 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:42.653 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:42.653 [26/36] Compiling C object samples/client.p/client.c.o 00:03:42.653 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:42.653 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:42.653 [29/36] Linking target samples/client 00:03:42.653 [30/36] Linking static target lib/libvfio-user.a 00:03:42.653 [31/36] Linking target test/unit_tests 00:03:42.653 [32/36] Linking target samples/lspci 00:03:42.912 [33/36] Linking target samples/server 00:03:42.912 [34/36] Linking target samples/null 00:03:42.912 [35/36] Linking target samples/gpio-pci-idio-16 00:03:42.912 [36/36] Linking target samples/shadow_ioeventfd_server 00:03:42.912 INFO: autodetecting backend as ninja 00:03:42.913 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:42.913 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:43.172 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:43.172 ninja: no work to do. 00:03:55.395 CC lib/ut/ut.o 00:03:55.395 CC lib/log/log.o 00:03:55.395 CC lib/log/log_flags.o 00:03:55.395 CC lib/log/log_deprecated.o 00:03:55.663 CC lib/ut_mock/mock.o 00:03:55.663 LIB libspdk_ut.a 00:03:55.663 LIB libspdk_log.a 00:03:55.663 LIB libspdk_ut_mock.a 00:03:55.923 CC lib/dma/dma.o 00:03:55.923 CC lib/ioat/ioat.o 00:03:55.923 CC lib/util/base64.o 00:03:55.923 CC lib/util/cpuset.o 00:03:55.923 CC lib/util/bit_array.o 00:03:55.923 CC lib/util/crc16.o 00:03:55.923 CXX lib/trace_parser/trace.o 00:03:55.923 CC lib/util/crc32.o 00:03:55.923 CC lib/util/crc32c.o 00:03:55.923 CC lib/util/crc32_ieee.o 00:03:55.923 CC lib/util/dif.o 00:03:55.923 CC lib/util/crc64.o 00:03:55.923 CC lib/util/fd.o 00:03:55.923 CC lib/util/fd_group.o 00:03:55.923 CC lib/util/file.o 00:03:55.923 CC lib/util/hexlify.o 00:03:55.923 CC lib/util/iov.o 00:03:55.923 CC lib/util/math.o 00:03:55.923 CC lib/util/net.o 00:03:55.923 CC lib/util/pipe.o 00:03:55.923 CC lib/util/strerror_tls.o 00:03:55.923 CC lib/util/string.o 00:03:55.923 CC lib/util/xor.o 00:03:55.923 CC lib/util/uuid.o 00:03:55.923 CC lib/util/zipf.o 00:03:55.923 CC lib/util/md5.o 00:03:56.183 LIB libspdk_dma.a 00:03:56.183 CC lib/vfio_user/host/vfio_user.o 00:03:56.183 CC lib/vfio_user/host/vfio_user_pci.o 00:03:56.183 LIB libspdk_ioat.a 00:03:56.183 LIB libspdk_vfio_user.a 00:03:56.183 LIB libspdk_util.a 00:03:56.443 LIB libspdk_trace_parser.a 00:03:56.703 CC lib/json/json_util.o 00:03:56.703 CC lib/json/json_parse.o 00:03:56.703 CC lib/json/json_write.o 00:03:56.703 CC lib/rdma_utils/rdma_utils.o 00:03:56.703 CC lib/idxd/idxd_kernel.o 00:03:56.703 CC lib/idxd/idxd.o 00:03:56.703 CC lib/idxd/idxd_user.o 00:03:56.703 CC lib/vmd/led.o 00:03:56.703 CC lib/vmd/vmd.o 00:03:56.704 CC lib/conf/conf.o 00:03:56.704 CC lib/env_dpdk/env.o 00:03:56.704 CC lib/env_dpdk/memory.o 00:03:56.704 CC lib/env_dpdk/pci.o 00:03:56.704 CC lib/env_dpdk/init.o 00:03:56.704 CC lib/env_dpdk/threads.o 00:03:56.704 CC lib/env_dpdk/pci_ioat.o 00:03:56.704 CC lib/env_dpdk/pci_virtio.o 00:03:56.704 CC lib/env_dpdk/pci_vmd.o 00:03:56.704 CC lib/env_dpdk/pci_idxd.o 00:03:56.704 CC lib/env_dpdk/pci_event.o 00:03:56.704 CC lib/env_dpdk/sigbus_handler.o 00:03:56.704 CC lib/env_dpdk/pci_dpdk.o 00:03:56.704 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:56.704 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:56.704 LIB libspdk_conf.a 00:03:56.704 LIB libspdk_rdma_utils.a 00:03:56.704 LIB libspdk_json.a 00:03:56.964 LIB libspdk_idxd.a 00:03:56.964 LIB libspdk_vmd.a 00:03:56.964 CC lib/jsonrpc/jsonrpc_server.o 00:03:56.964 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:56.964 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:56.964 CC lib/jsonrpc/jsonrpc_client.o 00:03:56.964 CC lib/rdma_provider/common.o 00:03:56.964 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:57.223 LIB libspdk_rdma_provider.a 00:03:57.223 LIB libspdk_jsonrpc.a 00:03:57.483 LIB libspdk_env_dpdk.a 00:03:57.483 CC lib/rpc/rpc.o 00:03:57.742 LIB libspdk_rpc.a 00:03:58.002 CC lib/notify/notify.o 00:03:58.002 CC lib/notify/notify_rpc.o 00:03:58.002 CC lib/keyring/keyring_rpc.o 00:03:58.002 CC lib/keyring/keyring.o 00:03:58.002 CC lib/trace/trace.o 00:03:58.002 CC lib/trace/trace_flags.o 00:03:58.002 CC lib/trace/trace_rpc.o 00:03:58.002 LIB libspdk_notify.a 00:03:58.261 LIB libspdk_keyring.a 00:03:58.261 LIB 
libspdk_trace.a 00:03:58.521 CC lib/sock/sock.o 00:03:58.521 CC lib/sock/sock_rpc.o 00:03:58.521 CC lib/thread/thread.o 00:03:58.521 CC lib/thread/iobuf.o 00:03:58.780 LIB libspdk_sock.a 00:03:59.039 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:59.039 CC lib/nvme/nvme_ctrlr.o 00:03:59.039 CC lib/nvme/nvme_fabric.o 00:03:59.039 CC lib/nvme/nvme_ns_cmd.o 00:03:59.039 CC lib/nvme/nvme_ns.o 00:03:59.039 CC lib/nvme/nvme_pcie_common.o 00:03:59.039 CC lib/nvme/nvme_pcie.o 00:03:59.039 CC lib/nvme/nvme_qpair.o 00:03:59.039 CC lib/nvme/nvme_discovery.o 00:03:59.039 CC lib/nvme/nvme.o 00:03:59.039 CC lib/nvme/nvme_quirks.o 00:03:59.039 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:59.039 CC lib/nvme/nvme_transport.o 00:03:59.039 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:59.039 CC lib/nvme/nvme_tcp.o 00:03:59.039 CC lib/nvme/nvme_opal.o 00:03:59.039 CC lib/nvme/nvme_io_msg.o 00:03:59.039 CC lib/nvme/nvme_poll_group.o 00:03:59.039 CC lib/nvme/nvme_zns.o 00:03:59.039 CC lib/nvme/nvme_stubs.o 00:03:59.039 CC lib/nvme/nvme_auth.o 00:03:59.039 CC lib/nvme/nvme_cuse.o 00:03:59.039 CC lib/nvme/nvme_vfio_user.o 00:03:59.039 CC lib/nvme/nvme_rdma.o 00:03:59.298 LIB libspdk_thread.a 00:03:59.558 CC lib/virtio/virtio.o 00:03:59.558 CC lib/virtio/virtio_pci.o 00:03:59.558 CC lib/virtio/virtio_vhost_user.o 00:03:59.558 CC lib/virtio/virtio_vfio_user.o 00:03:59.558 CC lib/blob/blobstore.o 00:03:59.558 CC lib/blob/blob_bs_dev.o 00:03:59.558 CC lib/blob/zeroes.o 00:03:59.558 CC lib/blob/request.o 00:03:59.558 CC lib/vfu_tgt/tgt_rpc.o 00:03:59.558 CC lib/vfu_tgt/tgt_endpoint.o 00:03:59.558 CC lib/accel/accel_rpc.o 00:03:59.558 CC lib/accel/accel.o 00:03:59.558 CC lib/accel/accel_sw.o 00:03:59.558 CC lib/fsdev/fsdev.o 00:03:59.558 CC lib/fsdev/fsdev_io.o 00:03:59.558 CC lib/init/subsystem_rpc.o 00:03:59.558 CC lib/init/json_config.o 00:03:59.558 CC lib/init/subsystem.o 00:03:59.558 CC lib/fsdev/fsdev_rpc.o 00:03:59.558 CC lib/init/rpc.o 00:03:59.817 LIB libspdk_init.a 00:03:59.817 LIB libspdk_virtio.a 00:03:59.817 LIB libspdk_vfu_tgt.a 00:03:59.817 LIB libspdk_fsdev.a 00:04:00.076 CC lib/event/app.o 00:04:00.076 CC lib/event/app_rpc.o 00:04:00.076 CC lib/event/reactor.o 00:04:00.076 CC lib/event/log_rpc.o 00:04:00.076 CC lib/event/scheduler_static.o 00:04:00.335 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:04:00.335 LIB libspdk_event.a 00:04:00.335 LIB libspdk_accel.a 00:04:00.335 LIB libspdk_nvme.a 00:04:00.594 CC lib/bdev/bdev.o 00:04:00.594 CC lib/bdev/part.o 00:04:00.594 CC lib/bdev/bdev_rpc.o 00:04:00.594 CC lib/bdev/bdev_zone.o 00:04:00.594 CC lib/bdev/scsi_nvme.o 00:04:00.594 LIB libspdk_fuse_dispatcher.a 00:04:01.164 LIB libspdk_blob.a 00:04:01.423 CC lib/lvol/lvol.o 00:04:01.683 CC lib/blobfs/blobfs.o 00:04:01.683 CC lib/blobfs/tree.o 00:04:01.942 LIB libspdk_lvol.a 00:04:01.942 LIB libspdk_blobfs.a 00:04:02.201 LIB libspdk_bdev.a 00:04:02.769 CC lib/ftl/ftl_core.o 00:04:02.769 CC lib/ftl/ftl_init.o 00:04:02.769 CC lib/ftl/ftl_layout.o 00:04:02.769 CC lib/ftl/ftl_debug.o 00:04:02.769 CC lib/ftl/ftl_io.o 00:04:02.769 CC lib/ftl/ftl_sb.o 00:04:02.769 CC lib/ftl/ftl_l2p.o 00:04:02.769 CC lib/ftl/ftl_l2p_flat.o 00:04:02.769 CC lib/ftl/ftl_nv_cache.o 00:04:02.769 CC lib/ftl/ftl_writer.o 00:04:02.769 CC lib/ftl/ftl_band.o 00:04:02.769 CC lib/ftl/ftl_band_ops.o 00:04:02.769 CC lib/ftl/ftl_rq.o 00:04:02.769 CC lib/ftl/ftl_reloc.o 00:04:02.769 CC lib/ftl/ftl_l2p_cache.o 00:04:02.769 CC lib/ftl/ftl_p2l.o 00:04:02.769 CC lib/ftl/ftl_p2l_log.o 00:04:02.769 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:02.769 CC lib/ftl/mngt/ftl_mngt.o 
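Every component in this listing follows the same two-phase pattern: each 'CC lib/<comp>/<src>.o' record is one object compile, and the 'LIB libspdk_<comp>.a' record that follows archives those objects into a static library. Roughly what one such pair expands to, with illustrative file names and simplified flags:

    clang-17 $CFLAGS -Iinclude -c lib/sock/sock.c -o sock.o          # CC lib/sock/sock.o
    clang-17 $CFLAGS -Iinclude -c lib/sock/sock_rpc.c -o sock_rpc.o  # CC lib/sock/sock_rpc.o
    ar crs libspdk_sock.a sock.o sock_rpc.o                          # LIB libspdk_sock.a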
00:04:02.769 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:04:02.769 CC lib/ftl/mngt/ftl_mngt_startup.o 00:04:02.769 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:02.769 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:02.770 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:02.770 CC lib/ftl/utils/ftl_conf.o 00:04:02.770 CC lib/ftl/utils/ftl_md.o 00:04:02.770 CC lib/nvmf/ctrlr.o 00:04:02.770 CC lib/ftl/utils/ftl_mempool.o 00:04:02.770 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:02.770 CC lib/ftl/utils/ftl_bitmap.o 00:04:02.770 CC lib/ftl/utils/ftl_property.o 00:04:02.770 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:02.770 CC lib/nvmf/ctrlr_discovery.o 00:04:02.770 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:02.770 CC lib/nvmf/subsystem.o 00:04:02.770 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:02.770 CC lib/nbd/nbd_rpc.o 00:04:02.770 CC lib/nbd/nbd.o 00:04:02.770 CC lib/nvmf/ctrlr_bdev.o 00:04:02.770 CC lib/scsi/dev.o 00:04:02.770 CC lib/nvmf/nvmf_rpc.o 00:04:02.770 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:02.770 CC lib/nvmf/nvmf.o 00:04:02.770 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:02.770 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:04:02.770 CC lib/nvmf/transport.o 00:04:02.770 CC lib/nvmf/tcp.o 00:04:02.770 CC lib/scsi/lun.o 00:04:02.770 CC lib/scsi/port.o 00:04:02.770 CC lib/nvmf/stubs.o 00:04:02.770 CC lib/scsi/scsi.o 00:04:02.770 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:02.770 CC lib/scsi/scsi_bdev.o 00:04:02.770 CC lib/ublk/ublk.o 00:04:02.770 CC lib/ublk/ublk_rpc.o 00:04:02.770 CC lib/nvmf/mdns_server.o 00:04:02.770 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:02.770 CC lib/nvmf/rdma.o 00:04:02.770 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:02.770 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:04:02.770 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:04:02.770 CC lib/nvmf/vfio_user.o 00:04:02.770 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:02.770 CC lib/scsi/scsi_rpc.o 00:04:02.770 CC lib/scsi/scsi_pr.o 00:04:02.770 CC lib/nvmf/auth.o 00:04:02.770 CC lib/ftl/base/ftl_base_dev.o 00:04:02.770 CC lib/scsi/task.o 00:04:02.770 CC lib/ftl/ftl_trace.o 00:04:02.770 CC lib/ftl/base/ftl_base_bdev.o 00:04:03.029 LIB libspdk_nbd.a 00:04:03.029 LIB libspdk_scsi.a 00:04:03.288 LIB libspdk_ublk.a 00:04:03.288 CC lib/vhost/vhost_scsi.o 00:04:03.288 CC lib/vhost/vhost.o 00:04:03.288 CC lib/vhost/vhost_rpc.o 00:04:03.288 CC lib/vhost/vhost_blk.o 00:04:03.288 CC lib/vhost/rte_vhost_user.o 00:04:03.288 CC lib/iscsi/init_grp.o 00:04:03.288 CC lib/iscsi/iscsi.o 00:04:03.288 CC lib/iscsi/conn.o 00:04:03.288 CC lib/iscsi/param.o 00:04:03.288 CC lib/iscsi/portal_grp.o 00:04:03.288 CC lib/iscsi/iscsi_rpc.o 00:04:03.288 CC lib/iscsi/tgt_node.o 00:04:03.288 CC lib/iscsi/iscsi_subsystem.o 00:04:03.288 CC lib/iscsi/task.o 00:04:03.288 LIB libspdk_ftl.a 00:04:03.856 LIB libspdk_nvmf.a 00:04:03.856 LIB libspdk_vhost.a 00:04:04.116 LIB libspdk_iscsi.a 00:04:04.686 CC module/env_dpdk/env_dpdk_rpc.o 00:04:04.686 CC module/vfu_device/vfu_virtio.o 00:04:04.686 CC module/vfu_device/vfu_virtio_blk.o 00:04:04.686 CC module/vfu_device/vfu_virtio_fs.o 00:04:04.686 CC module/vfu_device/vfu_virtio_scsi.o 00:04:04.686 CC module/vfu_device/vfu_virtio_rpc.o 00:04:04.686 CC module/keyring/linux/keyring_rpc.o 00:04:04.686 CC module/keyring/linux/keyring.o 00:04:04.686 CC module/keyring/file/keyring.o 00:04:04.686 CC 
module/keyring/file/keyring_rpc.o 00:04:04.686 CC module/accel/dsa/accel_dsa_rpc.o 00:04:04.686 CC module/accel/dsa/accel_dsa.o 00:04:04.686 LIB libspdk_env_dpdk_rpc.a 00:04:04.686 CC module/accel/error/accel_error_rpc.o 00:04:04.686 CC module/accel/error/accel_error.o 00:04:04.686 CC module/accel/ioat/accel_ioat.o 00:04:04.686 CC module/blob/bdev/blob_bdev.o 00:04:04.686 CC module/scheduler/gscheduler/gscheduler.o 00:04:04.686 CC module/sock/posix/posix.o 00:04:04.686 CC module/accel/ioat/accel_ioat_rpc.o 00:04:04.686 CC module/accel/iaa/accel_iaa.o 00:04:04.686 CC module/accel/iaa/accel_iaa_rpc.o 00:04:04.686 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:04.686 CC module/fsdev/aio/fsdev_aio.o 00:04:04.686 CC module/fsdev/aio/fsdev_aio_rpc.o 00:04:04.686 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:04.686 CC module/fsdev/aio/linux_aio_mgr.o 00:04:04.686 LIB libspdk_keyring_linux.a 00:04:04.686 LIB libspdk_keyring_file.a 00:04:04.686 LIB libspdk_scheduler_gscheduler.a 00:04:04.686 LIB libspdk_scheduler_dpdk_governor.a 00:04:04.686 LIB libspdk_accel_error.a 00:04:04.686 LIB libspdk_accel_ioat.a 00:04:04.945 LIB libspdk_accel_iaa.a 00:04:04.945 LIB libspdk_scheduler_dynamic.a 00:04:04.945 LIB libspdk_blob_bdev.a 00:04:04.945 LIB libspdk_accel_dsa.a 00:04:04.945 LIB libspdk_vfu_device.a 00:04:05.204 LIB libspdk_sock_posix.a 00:04:05.204 LIB libspdk_fsdev_aio.a 00:04:05.204 CC module/bdev/split/vbdev_split_rpc.o 00:04:05.204 CC module/blobfs/bdev/blobfs_bdev.o 00:04:05.204 CC module/bdev/passthru/vbdev_passthru.o 00:04:05.204 CC module/bdev/split/vbdev_split.o 00:04:05.204 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:05.204 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:05.204 CC module/bdev/malloc/bdev_malloc.o 00:04:05.204 CC module/bdev/aio/bdev_aio.o 00:04:05.204 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:05.204 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:05.204 CC module/bdev/null/bdev_null_rpc.o 00:04:05.204 CC module/bdev/ftl/bdev_ftl.o 00:04:05.204 CC module/bdev/aio/bdev_aio_rpc.o 00:04:05.204 CC module/bdev/null/bdev_null.o 00:04:05.204 CC module/bdev/gpt/gpt.o 00:04:05.204 CC module/bdev/gpt/vbdev_gpt.o 00:04:05.204 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:05.204 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:05.204 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:05.204 CC module/bdev/iscsi/bdev_iscsi.o 00:04:05.204 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:05.204 CC module/bdev/error/vbdev_error.o 00:04:05.204 CC module/bdev/error/vbdev_error_rpc.o 00:04:05.204 CC module/bdev/raid/bdev_raid.o 00:04:05.204 CC module/bdev/raid/bdev_raid_rpc.o 00:04:05.204 CC module/bdev/nvme/bdev_nvme.o 00:04:05.204 CC module/bdev/raid/bdev_raid_sb.o 00:04:05.204 CC module/bdev/raid/raid0.o 00:04:05.204 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:05.204 CC module/bdev/nvme/bdev_mdns_client.o 00:04:05.204 CC module/bdev/raid/raid1.o 00:04:05.204 CC module/bdev/raid/concat.o 00:04:05.204 CC module/bdev/nvme/nvme_rpc.o 00:04:05.204 CC module/bdev/nvme/vbdev_opal.o 00:04:05.204 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:05.204 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:05.204 CC module/bdev/delay/vbdev_delay.o 00:04:05.204 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:05.205 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:05.205 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:05.205 CC module/bdev/lvol/vbdev_lvol.o 00:04:05.205 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:05.462 LIB libspdk_blobfs_bdev.a 00:04:05.462 LIB libspdk_bdev_split.a 00:04:05.462 LIB 
libspdk_bdev_gpt.a 00:04:05.462 LIB libspdk_bdev_ftl.a 00:04:05.462 LIB libspdk_bdev_passthru.a 00:04:05.462 LIB libspdk_bdev_null.a 00:04:05.462 LIB libspdk_bdev_error.a 00:04:05.462 LIB libspdk_bdev_aio.a 00:04:05.462 LIB libspdk_bdev_iscsi.a 00:04:05.462 LIB libspdk_bdev_zone_block.a 00:04:05.462 LIB libspdk_bdev_malloc.a 00:04:05.462 LIB libspdk_bdev_delay.a 00:04:05.720 LIB libspdk_bdev_virtio.a 00:04:05.720 LIB libspdk_bdev_lvol.a 00:04:05.979 LIB libspdk_bdev_raid.a 00:04:06.914 LIB libspdk_bdev_nvme.a 00:04:07.172 CC module/event/subsystems/iobuf/iobuf.o 00:04:07.172 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:07.172 CC module/event/subsystems/fsdev/fsdev.o 00:04:07.172 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:07.172 CC module/event/subsystems/vmd/vmd.o 00:04:07.172 CC module/event/subsystems/scheduler/scheduler.o 00:04:07.172 CC module/event/subsystems/keyring/keyring.o 00:04:07.172 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:04:07.172 CC module/event/subsystems/sock/sock.o 00:04:07.172 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:07.430 LIB libspdk_event_iobuf.a 00:04:07.430 LIB libspdk_event_fsdev.a 00:04:07.430 LIB libspdk_event_vmd.a 00:04:07.430 LIB libspdk_event_keyring.a 00:04:07.430 LIB libspdk_event_vfu_tgt.a 00:04:07.430 LIB libspdk_event_scheduler.a 00:04:07.430 LIB libspdk_event_sock.a 00:04:07.430 LIB libspdk_event_vhost_blk.a 00:04:07.688 CC module/event/subsystems/accel/accel.o 00:04:07.688 LIB libspdk_event_accel.a 00:04:07.946 CC module/event/subsystems/bdev/bdev.o 00:04:08.204 LIB libspdk_event_bdev.a 00:04:08.462 CC module/event/subsystems/scsi/scsi.o 00:04:08.462 CC module/event/subsystems/nbd/nbd.o 00:04:08.462 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:08.462 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:08.462 CC module/event/subsystems/ublk/ublk.o 00:04:08.462 LIB libspdk_event_scsi.a 00:04:08.462 LIB libspdk_event_nbd.a 00:04:08.721 LIB libspdk_event_ublk.a 00:04:08.721 LIB libspdk_event_nvmf.a 00:04:08.980 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:08.980 CC module/event/subsystems/iscsi/iscsi.o 00:04:08.980 LIB libspdk_event_vhost_scsi.a 00:04:08.980 LIB libspdk_event_iscsi.a 00:04:09.238 CXX app/trace/trace.o 00:04:09.238 TEST_HEADER include/spdk/accel.h 00:04:09.238 CC app/spdk_lspci/spdk_lspci.o 00:04:09.238 CC app/spdk_nvme_discover/discovery_aer.o 00:04:09.238 TEST_HEADER include/spdk/base64.h 00:04:09.238 TEST_HEADER include/spdk/accel_module.h 00:04:09.238 TEST_HEADER include/spdk/assert.h 00:04:09.238 TEST_HEADER include/spdk/bdev.h 00:04:09.238 TEST_HEADER include/spdk/barrier.h 00:04:09.238 TEST_HEADER include/spdk/bdev_module.h 00:04:09.238 TEST_HEADER include/spdk/bdev_zone.h 00:04:09.238 TEST_HEADER include/spdk/bit_array.h 00:04:09.238 TEST_HEADER include/spdk/blob_bdev.h 00:04:09.238 TEST_HEADER include/spdk/bit_pool.h 00:04:09.238 TEST_HEADER include/spdk/blobfs.h 00:04:09.238 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:09.238 TEST_HEADER include/spdk/conf.h 00:04:09.238 TEST_HEADER include/spdk/blob.h 00:04:09.238 CC app/spdk_top/spdk_top.o 00:04:09.238 TEST_HEADER include/spdk/config.h 00:04:09.238 CC app/trace_record/trace_record.o 00:04:09.238 TEST_HEADER include/spdk/cpuset.h 00:04:09.238 TEST_HEADER include/spdk/crc16.h 00:04:09.238 TEST_HEADER include/spdk/crc64.h 00:04:09.238 CC test/rpc_client/rpc_client_test.o 00:04:09.238 TEST_HEADER include/spdk/crc32.h 00:04:09.238 CC app/spdk_nvme_perf/perf.o 00:04:09.238 TEST_HEADER include/spdk/dif.h 00:04:09.238 TEST_HEADER 
include/spdk/dma.h 00:04:09.238 TEST_HEADER include/spdk/env_dpdk.h 00:04:09.238 TEST_HEADER include/spdk/endian.h 00:04:09.238 TEST_HEADER include/spdk/event.h 00:04:09.238 TEST_HEADER include/spdk/env.h 00:04:09.238 TEST_HEADER include/spdk/fd_group.h 00:04:09.238 TEST_HEADER include/spdk/fd.h 00:04:09.238 TEST_HEADER include/spdk/file.h 00:04:09.238 TEST_HEADER include/spdk/fsdev.h 00:04:09.238 TEST_HEADER include/spdk/fsdev_module.h 00:04:09.238 TEST_HEADER include/spdk/ftl.h 00:04:09.238 CC app/spdk_nvme_identify/identify.o 00:04:09.238 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:09.238 TEST_HEADER include/spdk/fuse_dispatcher.h 00:04:09.238 TEST_HEADER include/spdk/gpt_spec.h 00:04:09.238 TEST_HEADER include/spdk/idxd.h 00:04:09.238 TEST_HEADER include/spdk/hexlify.h 00:04:09.238 TEST_HEADER include/spdk/histogram_data.h 00:04:09.238 TEST_HEADER include/spdk/idxd_spec.h 00:04:09.238 TEST_HEADER include/spdk/init.h 00:04:09.238 TEST_HEADER include/spdk/ioat.h 00:04:09.507 TEST_HEADER include/spdk/ioat_spec.h 00:04:09.507 TEST_HEADER include/spdk/json.h 00:04:09.507 TEST_HEADER include/spdk/iscsi_spec.h 00:04:09.507 TEST_HEADER include/spdk/jsonrpc.h 00:04:09.507 TEST_HEADER include/spdk/likely.h 00:04:09.507 CC app/nvmf_tgt/nvmf_main.o 00:04:09.507 TEST_HEADER include/spdk/keyring.h 00:04:09.507 TEST_HEADER include/spdk/lvol.h 00:04:09.507 TEST_HEADER include/spdk/keyring_module.h 00:04:09.507 TEST_HEADER include/spdk/memory.h 00:04:09.507 TEST_HEADER include/spdk/log.h 00:04:09.507 TEST_HEADER include/spdk/nbd.h 00:04:09.507 TEST_HEADER include/spdk/md5.h 00:04:09.507 TEST_HEADER include/spdk/net.h 00:04:09.507 TEST_HEADER include/spdk/mmio.h 00:04:09.507 TEST_HEADER include/spdk/nvme.h 00:04:09.507 TEST_HEADER include/spdk/notify.h 00:04:09.507 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:09.507 TEST_HEADER include/spdk/nvme_intel.h 00:04:09.507 TEST_HEADER include/spdk/nvme_zns.h 00:04:09.507 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:09.507 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:09.507 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:09.507 TEST_HEADER include/spdk/nvme_spec.h 00:04:09.507 TEST_HEADER include/spdk/nvmf.h 00:04:09.507 CC app/iscsi_tgt/iscsi_tgt.o 00:04:09.507 TEST_HEADER include/spdk/nvmf_transport.h 00:04:09.507 TEST_HEADER include/spdk/nvmf_spec.h 00:04:09.507 TEST_HEADER include/spdk/opal_spec.h 00:04:09.507 CC app/spdk_dd/spdk_dd.o 00:04:09.507 TEST_HEADER include/spdk/opal.h 00:04:09.507 TEST_HEADER include/spdk/pipe.h 00:04:09.507 TEST_HEADER include/spdk/pci_ids.h 00:04:09.507 TEST_HEADER include/spdk/reduce.h 00:04:09.507 TEST_HEADER include/spdk/queue.h 00:04:09.507 TEST_HEADER include/spdk/scheduler.h 00:04:09.507 TEST_HEADER include/spdk/scsi.h 00:04:09.507 TEST_HEADER include/spdk/scsi_spec.h 00:04:09.507 TEST_HEADER include/spdk/sock.h 00:04:09.507 TEST_HEADER include/spdk/rpc.h 00:04:09.507 TEST_HEADER include/spdk/string.h 00:04:09.507 TEST_HEADER include/spdk/trace.h 00:04:09.507 TEST_HEADER include/spdk/trace_parser.h 00:04:09.507 TEST_HEADER include/spdk/stdinc.h 00:04:09.507 TEST_HEADER include/spdk/tree.h 00:04:09.507 TEST_HEADER include/spdk/thread.h 00:04:09.507 TEST_HEADER include/spdk/uuid.h 00:04:09.507 TEST_HEADER include/spdk/util.h 00:04:09.507 TEST_HEADER include/spdk/version.h 00:04:09.507 TEST_HEADER include/spdk/ublk.h 00:04:09.507 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:09.507 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:09.507 TEST_HEADER include/spdk/xor.h 00:04:09.507 TEST_HEADER include/spdk/vmd.h 00:04:09.507 
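The TEST_HEADER/CXX pairs in this stretch look like a header-hygiene check: each public spdk header is wrapped in its own C++ translation unit and compiled standalone, so a header that is not self-contained (or not C++-clean) fails by itself instead of hiding behind whichever file included it first. A hedged one-header illustration:

    printf '#include <spdk/accel.h>\n' > accel.cpp                  # one tiny TU per header
    clang++-17 -Iinclude -c accel.cpp -o test/cpp_headers/accel.o   # CXX test/cpp_headers/accel.o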
CXX test/cpp_headers/accel.o 00:04:09.507 CXX test/cpp_headers/accel_module.o 00:04:09.507 TEST_HEADER include/spdk/vhost.h 00:04:09.507 TEST_HEADER include/spdk/zipf.h 00:04:09.507 CXX test/cpp_headers/assert.o 00:04:09.507 CXX test/cpp_headers/barrier.o 00:04:09.507 CXX test/cpp_headers/base64.o 00:04:09.507 CXX test/cpp_headers/bdev.o 00:04:09.507 CXX test/cpp_headers/bdev_module.o 00:04:09.507 CXX test/cpp_headers/bit_array.o 00:04:09.507 CXX test/cpp_headers/bdev_zone.o 00:04:09.507 CXX test/cpp_headers/blobfs_bdev.o 00:04:09.507 CXX test/cpp_headers/blob_bdev.o 00:04:09.507 CXX test/cpp_headers/bit_pool.o 00:04:09.507 CC app/spdk_tgt/spdk_tgt.o 00:04:09.507 CXX test/cpp_headers/conf.o 00:04:09.507 CXX test/cpp_headers/blobfs.o 00:04:09.507 CXX test/cpp_headers/blob.o 00:04:09.507 CXX test/cpp_headers/crc32.o 00:04:09.507 CXX test/cpp_headers/crc16.o 00:04:09.507 CXX test/cpp_headers/config.o 00:04:09.507 CXX test/cpp_headers/cpuset.o 00:04:09.507 CXX test/cpp_headers/crc64.o 00:04:09.507 CXX test/cpp_headers/endian.o 00:04:09.507 CXX test/cpp_headers/env_dpdk.o 00:04:09.507 CXX test/cpp_headers/dif.o 00:04:09.507 CXX test/cpp_headers/dma.o 00:04:09.507 CXX test/cpp_headers/env.o 00:04:09.507 CXX test/cpp_headers/event.o 00:04:09.507 CXX test/cpp_headers/fd.o 00:04:09.507 CXX test/cpp_headers/fd_group.o 00:04:09.507 CXX test/cpp_headers/file.o 00:04:09.507 CXX test/cpp_headers/fsdev.o 00:04:09.507 CXX test/cpp_headers/fsdev_module.o 00:04:09.507 CXX test/cpp_headers/ftl.o 00:04:09.507 CXX test/cpp_headers/gpt_spec.o 00:04:09.507 CXX test/cpp_headers/fuse_dispatcher.o 00:04:09.507 CXX test/cpp_headers/histogram_data.o 00:04:09.507 CXX test/cpp_headers/hexlify.o 00:04:09.507 CXX test/cpp_headers/idxd.o 00:04:09.507 CXX test/cpp_headers/init.o 00:04:09.507 CXX test/cpp_headers/ioat.o 00:04:09.507 CXX test/cpp_headers/idxd_spec.o 00:04:09.507 CXX test/cpp_headers/ioat_spec.o 00:04:09.507 CXX test/cpp_headers/iscsi_spec.o 00:04:09.507 CXX test/cpp_headers/jsonrpc.o 00:04:09.507 CXX test/cpp_headers/json.o 00:04:09.507 CXX test/cpp_headers/keyring.o 00:04:09.507 CXX test/cpp_headers/keyring_module.o 00:04:09.507 CXX test/cpp_headers/likely.o 00:04:09.507 CXX test/cpp_headers/log.o 00:04:09.507 CXX test/cpp_headers/lvol.o 00:04:09.507 CXX test/cpp_headers/md5.o 00:04:09.507 CC examples/ioat/verify/verify.o 00:04:09.507 CXX test/cpp_headers/memory.o 00:04:09.507 CXX test/cpp_headers/nbd.o 00:04:09.507 CXX test/cpp_headers/mmio.o 00:04:09.507 CXX test/cpp_headers/notify.o 00:04:09.507 CXX test/cpp_headers/net.o 00:04:09.507 CXX test/cpp_headers/nvme.o 00:04:09.507 CXX test/cpp_headers/nvme_intel.o 00:04:09.507 CXX test/cpp_headers/nvme_ocssd.o 00:04:09.507 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:09.507 CXX test/cpp_headers/nvme_spec.o 00:04:09.507 CXX test/cpp_headers/nvme_zns.o 00:04:09.507 CXX test/cpp_headers/nvmf_cmd.o 00:04:09.507 CC examples/util/zipf/zipf.o 00:04:09.507 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:09.507 CXX test/cpp_headers/nvmf_spec.o 00:04:09.507 CXX test/cpp_headers/nvmf.o 00:04:09.507 CXX test/cpp_headers/opal.o 00:04:09.507 CXX test/cpp_headers/opal_spec.o 00:04:09.507 CXX test/cpp_headers/nvmf_transport.o 00:04:09.507 CXX test/cpp_headers/pci_ids.o 00:04:09.507 CXX test/cpp_headers/queue.o 00:04:09.507 CXX test/cpp_headers/pipe.o 00:04:09.507 CXX test/cpp_headers/reduce.o 00:04:09.507 CXX test/cpp_headers/rpc.o 00:04:09.507 CXX test/cpp_headers/scheduler.o 00:04:09.507 CC test/env/pci/pci_ut.o 00:04:09.507 CC examples/ioat/perf/perf.o 00:04:09.507 CC 
test/app/jsoncat/jsoncat.o 00:04:09.507 CXX test/cpp_headers/scsi_spec.o 00:04:09.507 CXX test/cpp_headers/scsi.o 00:04:09.507 CXX test/cpp_headers/sock.o 00:04:09.507 CXX test/cpp_headers/stdinc.o 00:04:09.507 CXX test/cpp_headers/string.o 00:04:09.507 CXX test/cpp_headers/trace.o 00:04:09.507 CXX test/cpp_headers/thread.o 00:04:09.507 CXX test/cpp_headers/trace_parser.o 00:04:09.507 CC test/env/vtophys/vtophys.o 00:04:09.507 LINK spdk_lspci 00:04:09.507 CC test/app/stub/stub.o 00:04:09.507 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:09.507 CC test/env/memory/memory_ut.o 00:04:09.507 CC test/app/histogram_perf/histogram_perf.o 00:04:09.507 CC app/fio/nvme/fio_plugin.o 00:04:09.507 CC test/thread/poller_perf/poller_perf.o 00:04:09.507 CC test/thread/lock/spdk_lock.o 00:04:09.507 CXX test/cpp_headers/tree.o 00:04:09.507 CC test/dma/test_dma/test_dma.o 00:04:09.507 LINK rpc_client_test 00:04:09.507 CC test/app/bdev_svc/bdev_svc.o 00:04:09.507 CXX test/cpp_headers/ublk.o 00:04:09.507 LINK spdk_nvme_discover 00:04:09.507 CC app/fio/bdev/fio_plugin.o 00:04:09.507 CC test/env/mem_callbacks/mem_callbacks.o 00:04:09.507 LINK interrupt_tgt 00:04:09.507 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:09.507 LINK nvmf_tgt 00:04:09.507 LINK spdk_trace_record 00:04:09.507 CXX test/cpp_headers/util.o 00:04:09.766 CXX test/cpp_headers/uuid.o 00:04:09.766 CXX test/cpp_headers/version.o 00:04:09.766 CXX test/cpp_headers/vfio_user_pci.o 00:04:09.766 CXX test/cpp_headers/vfio_user_spec.o 00:04:09.766 CXX test/cpp_headers/vhost.o 00:04:09.766 CXX test/cpp_headers/vmd.o 00:04:09.766 CXX test/cpp_headers/xor.o 00:04:09.766 CXX test/cpp_headers/zipf.o 00:04:09.766 LINK jsoncat 00:04:09.766 LINK zipf 00:04:09.766 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:09.766 LINK iscsi_tgt 00:04:09.766 LINK vtophys 00:04:09.766 LINK histogram_perf 00:04:09.766 LINK poller_perf 00:04:09.766 LINK env_dpdk_post_init 00:04:09.766 LINK verify 00:04:09.766 LINK stub 00:04:09.766 LINK spdk_tgt 00:04:09.766 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:09.766 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:09.766 LINK bdev_svc 00:04:09.766 LINK ioat_perf 00:04:09.766 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:04:09.766 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:04:09.766 LINK spdk_trace 00:04:09.766 LINK pci_ut 00:04:10.025 LINK nvme_fuzz 00:04:10.025 LINK test_dma 00:04:10.025 LINK spdk_dd 00:04:10.025 LINK spdk_nvme_perf 00:04:10.025 LINK spdk_nvme_identify 00:04:10.025 LINK mem_callbacks 00:04:10.025 LINK vhost_fuzz 00:04:10.025 LINK llvm_vfio_fuzz 00:04:10.025 LINK spdk_bdev 00:04:10.025 LINK spdk_nvme 00:04:10.025 LINK spdk_top 00:04:10.283 LINK llvm_nvme_fuzz 00:04:10.283 CC app/vhost/vhost.o 00:04:10.283 CC examples/sock/hello_world/hello_sock.o 00:04:10.283 CC examples/idxd/perf/perf.o 00:04:10.283 CC examples/vmd/led/led.o 00:04:10.283 CC examples/vmd/lsvmd/lsvmd.o 00:04:10.283 CC examples/thread/thread/thread_ex.o 00:04:10.283 LINK memory_ut 00:04:10.542 LINK lsvmd 00:04:10.542 LINK led 00:04:10.542 LINK vhost 00:04:10.542 LINK hello_sock 00:04:10.542 LINK spdk_lock 00:04:10.542 LINK idxd_perf 00:04:10.542 LINK thread 00:04:10.801 LINK iscsi_fuzz 00:04:11.059 CC examples/nvme/hello_world/hello_world.o 00:04:11.059 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:11.059 CC examples/nvme/arbitration/arbitration.o 00:04:11.059 CC examples/nvme/hotplug/hotplug.o 00:04:11.318 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:11.318 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:11.318 CC 
test/event/event_perf/event_perf.o 00:04:11.318 CC examples/nvme/reconnect/reconnect.o 00:04:11.318 CC test/event/reactor_perf/reactor_perf.o 00:04:11.318 CC examples/nvme/abort/abort.o 00:04:11.318 CC test/event/reactor/reactor.o 00:04:11.318 CC test/event/scheduler/scheduler.o 00:04:11.318 CC test/event/app_repeat/app_repeat.o 00:04:11.318 LINK event_perf 00:04:11.318 LINK reactor_perf 00:04:11.318 LINK reactor 00:04:11.318 LINK pmr_persistence 00:04:11.318 LINK hotplug 00:04:11.318 LINK hello_world 00:04:11.318 LINK cmb_copy 00:04:11.318 LINK app_repeat 00:04:11.318 LINK scheduler 00:04:11.318 LINK arbitration 00:04:11.318 LINK reconnect 00:04:11.318 LINK abort 00:04:11.577 LINK nvme_manage 00:04:11.577 CC test/nvme/e2edp/nvme_dp.o 00:04:11.577 CC test/nvme/connect_stress/connect_stress.o 00:04:11.577 CC test/nvme/err_injection/err_injection.o 00:04:11.577 CC test/nvme/overhead/overhead.o 00:04:11.577 CC test/nvme/reserve/reserve.o 00:04:11.577 CC test/nvme/sgl/sgl.o 00:04:11.577 CC test/nvme/fused_ordering/fused_ordering.o 00:04:11.577 CC test/nvme/aer/aer.o 00:04:11.577 CC test/nvme/fdp/fdp.o 00:04:11.577 CC test/nvme/startup/startup.o 00:04:11.577 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:11.577 CC test/nvme/cuse/cuse.o 00:04:11.577 CC test/nvme/simple_copy/simple_copy.o 00:04:11.577 CC test/nvme/reset/reset.o 00:04:11.577 CC test/nvme/compliance/nvme_compliance.o 00:04:11.577 CC test/nvme/boot_partition/boot_partition.o 00:04:11.577 CC test/accel/dif/dif.o 00:04:11.577 CC test/blobfs/mkfs/mkfs.o 00:04:11.836 CC test/lvol/esnap/esnap.o 00:04:11.836 LINK connect_stress 00:04:11.836 LINK startup 00:04:11.836 LINK err_injection 00:04:11.836 LINK fused_ordering 00:04:11.836 LINK reserve 00:04:11.836 LINK doorbell_aers 00:04:11.836 LINK boot_partition 00:04:11.836 LINK nvme_dp 00:04:11.836 LINK simple_copy 00:04:11.836 LINK sgl 00:04:11.836 LINK fdp 00:04:11.836 LINK overhead 00:04:11.836 LINK mkfs 00:04:11.836 LINK reset 00:04:11.836 LINK aer 00:04:11.836 LINK nvme_compliance 00:04:12.096 LINK dif 00:04:12.355 CC examples/accel/perf/accel_perf.o 00:04:12.355 CC examples/blob/hello_world/hello_blob.o 00:04:12.355 CC examples/blob/cli/blobcli.o 00:04:12.355 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:12.614 LINK cuse 00:04:12.614 LINK hello_blob 00:04:12.614 LINK hello_fsdev 00:04:12.614 LINK accel_perf 00:04:12.614 LINK blobcli 00:04:13.550 CC examples/bdev/bdevperf/bdevperf.o 00:04:13.550 CC examples/bdev/hello_world/hello_bdev.o 00:04:13.550 LINK hello_bdev 00:04:13.550 CC test/bdev/bdevio/bdevio.o 00:04:13.809 LINK bdevperf 00:04:13.809 LINK bdevio 00:04:15.187 LINK esnap 00:04:15.187 CC examples/nvmf/nvmf/nvmf.o 00:04:15.445 LINK nvmf 00:04:16.823 00:04:16.823 real 0m36.588s 00:04:16.823 user 4m39.090s 00:04:16.823 sys 1m43.108s 00:04:16.823 19:17:36 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:04:16.823 19:17:36 make -- common/autotest_common.sh@10 -- $ set +x 00:04:16.823 ************************************ 00:04:16.823 END TEST make 00:04:16.823 ************************************ 00:04:16.823 19:17:36 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:16.823 19:17:36 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:16.823 19:17:36 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:16.823 19:17:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:16.823 19:17:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:16.823 19:17:36 -- 
pm/common@44 -- $ pid=1605280 00:04:16.823 19:17:36 -- pm/common@50 -- $ kill -TERM 1605280 00:04:16.823 19:17:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:16.823 19:17:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:16.823 19:17:36 -- pm/common@44 -- $ pid=1605282 00:04:16.823 19:17:36 -- pm/common@50 -- $ kill -TERM 1605282 00:04:16.823 19:17:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:16.823 19:17:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:16.823 19:17:36 -- pm/common@44 -- $ pid=1605284 00:04:16.823 19:17:36 -- pm/common@50 -- $ kill -TERM 1605284 00:04:16.823 19:17:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:16.823 19:17:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:16.823 19:17:36 -- pm/common@44 -- $ pid=1605302 00:04:16.823 19:17:36 -- pm/common@50 -- $ sudo -E kill -TERM 1605302 00:04:16.823 19:17:36 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:04:16.823 19:17:36 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:04:16.823 19:17:36 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:16.823 19:17:36 -- common/autotest_common.sh@1693 -- # lcov --version 00:04:16.823 19:17:36 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:17.083 19:17:36 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:17.083 19:17:36 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:17.083 19:17:36 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:17.083 19:17:36 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:17.083 19:17:36 -- scripts/common.sh@336 -- # IFS=.-: 00:04:17.083 19:17:36 -- scripts/common.sh@336 -- # read -ra ver1 00:04:17.083 19:17:36 -- scripts/common.sh@337 -- # IFS=.-: 00:04:17.083 19:17:36 -- scripts/common.sh@337 -- # read -ra ver2 00:04:17.083 19:17:36 -- scripts/common.sh@338 -- # local 'op=<' 00:04:17.083 19:17:36 -- scripts/common.sh@340 -- # ver1_l=2 00:04:17.083 19:17:36 -- scripts/common.sh@341 -- # ver2_l=1 00:04:17.083 19:17:36 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:17.083 19:17:36 -- scripts/common.sh@344 -- # case "$op" in 00:04:17.083 19:17:36 -- scripts/common.sh@345 -- # : 1 00:04:17.083 19:17:36 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:17.083 19:17:36 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:17.083 19:17:36 -- scripts/common.sh@365 -- # decimal 1 00:04:17.083 19:17:36 -- scripts/common.sh@353 -- # local d=1 00:04:17.083 19:17:36 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:17.083 19:17:36 -- scripts/common.sh@355 -- # echo 1 00:04:17.083 19:17:36 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:17.083 19:17:36 -- scripts/common.sh@366 -- # decimal 2 00:04:17.083 19:17:36 -- scripts/common.sh@353 -- # local d=2 00:04:17.083 19:17:36 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:17.083 19:17:36 -- scripts/common.sh@355 -- # echo 2 00:04:17.083 19:17:36 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:17.083 19:17:36 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:17.083 19:17:36 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:17.083 19:17:36 -- scripts/common.sh@368 -- # return 0 00:04:17.083 19:17:36 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:17.083 19:17:36 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:17.083 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:17.083 --rc genhtml_branch_coverage=1 00:04:17.083 --rc genhtml_function_coverage=1 00:04:17.083 --rc genhtml_legend=1 00:04:17.083 --rc geninfo_all_blocks=1 00:04:17.083 --rc geninfo_unexecuted_blocks=1 00:04:17.083 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:17.083 ' 00:04:17.083 19:17:36 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:17.083 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:17.083 --rc genhtml_branch_coverage=1 00:04:17.083 --rc genhtml_function_coverage=1 00:04:17.083 --rc genhtml_legend=1 00:04:17.083 --rc geninfo_all_blocks=1 00:04:17.083 --rc geninfo_unexecuted_blocks=1 00:04:17.083 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:17.083 ' 00:04:17.083 19:17:36 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:17.083 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:17.083 --rc genhtml_branch_coverage=1 00:04:17.083 --rc genhtml_function_coverage=1 00:04:17.083 --rc genhtml_legend=1 00:04:17.083 --rc geninfo_all_blocks=1 00:04:17.083 --rc geninfo_unexecuted_blocks=1 00:04:17.083 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:17.083 ' 00:04:17.083 19:17:36 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:17.083 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:17.083 --rc genhtml_branch_coverage=1 00:04:17.083 --rc genhtml_function_coverage=1 00:04:17.083 --rc genhtml_legend=1 00:04:17.083 --rc geninfo_all_blocks=1 00:04:17.083 --rc geninfo_unexecuted_blocks=1 00:04:17.083 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:17.084 ' 00:04:17.084 19:17:36 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:17.084 19:17:36 -- nvmf/common.sh@7 -- # uname -s 00:04:17.084 19:17:36 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:17.084 19:17:36 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:17.084 19:17:36 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:17.084 19:17:36 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:17.084 19:17:36 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:17.084 19:17:36 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:17.084 19:17:36 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:17.084 19:17:36 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:17.084 19:17:36 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:17.084 19:17:36 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:17.084 19:17:36 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:17.084 19:17:36 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:17.084 19:17:36 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:17.084 19:17:36 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:17.084 19:17:36 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:17.084 19:17:36 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:17.084 19:17:36 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:17.084 19:17:36 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:17.084 19:17:36 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:17.084 19:17:36 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:17.084 19:17:36 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:17.084 19:17:36 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:17.084 19:17:36 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:17.084 19:17:36 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:17.084 19:17:36 -- paths/export.sh@5 -- # export PATH 00:04:17.084 19:17:36 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:17.084 19:17:36 -- nvmf/common.sh@51 -- # : 0 00:04:17.084 19:17:36 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:17.084 19:17:36 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:17.084 19:17:36 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:17.084 19:17:36 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:17.084 19:17:36 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:17.084 19:17:36 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:17.084 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:17.084 19:17:36 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:17.084 19:17:36 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:17.084 19:17:36 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:17.084 19:17:36 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:17.084 19:17:36 -- spdk/autotest.sh@32 -- # uname -s 00:04:17.084 
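The '[: : integer expression expected' complaint above is the one real shell error in this stretch, and the xtrace beside it shows the cause: '[' '' -eq 1 ']' reaches a numeric test with an empty string because the flag it checks is unset. Minimal reproduction and the usual guard:

    flag=''
    [ "$flag" -eq 1 ]        # bash: [: : integer expression expected (status 2)
    [ "${flag:-0}" -eq 1 ]   # guarded: empty defaults to 0 and the test is quiet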
19:17:36 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:17.084 19:17:36 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:17.084 19:17:36 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:17.084 19:17:36 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:17.084 19:17:36 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:17.084 19:17:36 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:17.084 19:17:36 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:17.084 19:17:36 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:17.084 19:17:36 -- spdk/autotest.sh@48 -- # udevadm_pid=1683633 00:04:17.084 19:17:36 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:17.084 19:17:36 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:17.084 19:17:36 -- pm/common@17 -- # local monitor 00:04:17.084 19:17:36 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.084 19:17:36 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.084 19:17:36 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.084 19:17:36 -- pm/common@21 -- # date +%s 00:04:17.084 19:17:36 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.084 19:17:36 -- pm/common@21 -- # date +%s 00:04:17.084 19:17:36 -- pm/common@25 -- # sleep 1 00:04:17.084 19:17:36 -- pm/common@21 -- # date +%s 00:04:17.084 19:17:36 -- pm/common@21 -- # date +%s 00:04:17.084 19:17:36 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732904256 00:04:17.084 19:17:36 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732904256 00:04:17.084 19:17:36 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732904256 00:04:17.084 19:17:36 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732904256 00:04:17.084 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732904256_collect-vmstat.pm.log 00:04:17.084 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732904256_collect-cpu-load.pm.log 00:04:17.084 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732904256_collect-cpu-temp.pm.log 00:04:17.084 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732904256_collect-bmc-pm.bmc.pm.log 00:04:18.022 19:17:37 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:18.022 19:17:37 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:18.022 19:17:37 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:18.022 19:17:37 -- common/autotest_common.sh@10 -- # set +x 
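The block above fans out four resource monitors before the tests proper; each collector gets the same output directory and an epoch-stamped log prefix, and the 'Redirecting to ...pm.log' lines are those collectors detaching into their logs. The launch pattern, condensed (the loop is ours; the script starts each monitor explicitly, and only the BMC collector needs sudo):

    out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power
    stamp=$(date +%s)                       # 1732904256 in this run
    for mon in collect-cpu-load collect-vmstat collect-cpu-temp; do
        scripts/perf/pm/$mon -d "$out" -l -p "monitor.autotest.sh.$stamp" &
    done
    sudo -E scripts/perf/pm/collect-bmc-pm -d "$out" -l -p "monitor.autotest.sh.$stamp" &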
00:04:18.022 19:17:37 -- spdk/autotest.sh@59 -- # create_test_list 00:04:18.022 19:17:37 -- common/autotest_common.sh@752 -- # xtrace_disable 00:04:18.022 19:17:37 -- common/autotest_common.sh@10 -- # set +x 00:04:18.022 19:17:37 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:18.022 19:17:37 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:18.302 19:17:37 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:18.302 19:17:37 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:18.302 19:17:37 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:18.302 19:17:37 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:18.302 19:17:37 -- common/autotest_common.sh@1457 -- # uname 00:04:18.302 19:17:37 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:18.302 19:17:37 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:18.302 19:17:37 -- common/autotest_common.sh@1477 -- # uname 00:04:18.302 19:17:37 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:18.302 19:17:37 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:18.302 19:17:37 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:18.302 lcov: LCOV version 1.15 00:04:18.302 19:17:37 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:26.419 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:26.419 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:34.644 19:17:53 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:34.644 19:17:53 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:34.644 19:17:53 -- common/autotest_common.sh@10 -- # set +x 00:04:34.644 19:17:53 -- spdk/autotest.sh@78 -- # rm -f 00:04:34.644 19:17:53 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:37.204 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:37.204 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:37.204 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:37.464 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:37.464 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:37.464 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:37.464 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:37.464 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:37.464 19:17:57 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:37.464 19:17:57 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:37.464 19:17:57 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:37.464 19:17:57 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:37.464 19:17:57 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:37.464 19:17:57 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:37.464 19:17:57 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:37.464 19:17:57 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:37.464 19:17:57 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:37.464 19:17:57 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:37.464 19:17:57 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:37.464 19:17:57 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:37.464 19:17:57 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:37.464 19:17:57 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:37.464 19:17:57 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:37.464 No valid GPT data, bailing 00:04:37.464 19:17:57 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:37.464 19:17:57 -- scripts/common.sh@394 -- # pt= 00:04:37.464 19:17:57 -- scripts/common.sh@395 -- # return 1 00:04:37.464 19:17:57 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:37.464 1+0 records in 00:04:37.464 1+0 records out 00:04:37.464 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00229767 s, 456 MB/s 00:04:37.464 19:17:57 -- spdk/autotest.sh@105 -- # sync 00:04:37.464 19:17:57 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:37.464 19:17:57 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:37.464 19:17:57 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:45.586 19:18:04 -- spdk/autotest.sh@111 -- # uname -s 00:04:45.586 19:18:04 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:45.586 19:18:04 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:45.586 19:18:04 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:45.586 19:18:04 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:45.586 19:18:04 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:45.586 19:18:04 -- common/autotest_common.sh@10 -- # set +x 00:04:45.586 ************************************ 00:04:45.586 START TEST setup.sh 00:04:45.586 ************************************ 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:45.586 * Looking for test storage... 
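
Just before the spdk-gpt.py/blkid GPT probe above, get_zoned_devs walks /sys/block/nvme* and classifies each device by its queue/zoned sysfs attribute ("none" for a conventional device, versus the host-aware and host-managed zoned models). A standalone sketch of that check, condensed by hand rather than copied from the SPDK helper:

    #!/usr/bin/env bash
    # Report which NVMe block devices are zoned, per sysfs.
    is_block_zoned() {
        local dev=$1
        # Kernels without zoned-block support may omit the attribute.
        [[ -e /sys/block/$dev/queue/zoned ]] || return 1
        # "none" marks a conventional device, anything else a zoned one.
        [[ $(</sys/block/$dev/queue/zoned) != none ]]
    }

    shopt -s nullglob
    for path in /sys/block/nvme*; do
        dev=${path##*/}
        if is_block_zoned "$dev"; then
            echo "$dev: zoned"
        else
            echo "$dev: conventional"
        fi
    done

In the trace, nvme0n1 reports "none", so the zoned_devs map stays empty and the dd wipe proceeds on the plain namespace.
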
00:04:45.586 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:45.586 19:18:04 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 
00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:45.586 19:18:04 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:45.586 19:18:04 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:45.586 19:18:04 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:45.586 ************************************ 00:04:45.586 START TEST acl 00:04:45.586 ************************************ 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:45.586 * Looking for test storage... 
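
The lt/cmp_versions dance traced above, and repeated below for each sub-test, is how these scripts decide that the installed lcov (1.15 here) predates version 2: both version strings are split on '.', '-' and ':' and compared numerically field by field. A condensed sketch of the same idea, assuming purely numeric fields (the real helper lives in scripts/common.sh; this rewrite is mine, not its verbatim code):

    #!/usr/bin/env bash
    # version_lt A B  ->  exit 0 iff version A sorts strictly before B.
    version_lt() {
        local IFS=.-:                # same separators the trace splits on
        local -a a=($1) b=($2)
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < n; i++ )); do
            # A missing field compares as 0, so "1.15" acts like "1.15.0".
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1                     # equal versions are not "less than"
    }

    version_lt 1.15 2 && echo "lcov 1.15 predates 2.x"

Because the comparison succeeds, the harness selects the 1.x-era LCOV_OPTS seen exported in the surrounding trace.
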
00:04:45.586 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:45.586 19:18:04 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc 
genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:45.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.586 --rc genhtml_branch_coverage=1 00:04:45.586 --rc genhtml_function_coverage=1 00:04:45.586 --rc genhtml_legend=1 00:04:45.586 --rc geninfo_all_blocks=1 00:04:45.586 --rc geninfo_unexecuted_blocks=1 00:04:45.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.586 ' 00:04:45.586 19:18:04 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:45.586 19:18:04 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.586 19:18:04 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:45.586 19:18:04 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:45.586 19:18:04 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:45.586 19:18:04 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:45.586 19:18:04 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:45.586 19:18:04 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:45.586 19:18:04 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:48.879 19:18:08 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:48.879 19:18:08 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:48.879 19:18:08 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:48.879 19:18:08 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:48.879 19:18:08 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:48.879 19:18:08 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:52.174 Hugepages 00:04:52.174 node hugesize free / total 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 00:04:52.174 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:52.174 19:18:11 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:52.174 19:18:11 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:52.174 19:18:11 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:52.175 19:18:11 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:52.175 ************************************ 00:04:52.175 START TEST denied 00:04:52.175 ************************************ 00:04:52.175 19:18:11 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:52.175 19:18:11 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:52.175 19:18:11 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:52.175 19:18:11 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:52.175 19:18:11 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.175 19:18:11 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:56.369 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:56.369 19:18:15 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:00.562 00:05:00.562 real 0m8.361s 00:05:00.562 user 0m2.652s 00:05:00.562 sys 0m5.050s 00:05:00.562 19:18:20 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:00.562 19:18:20 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:00.562 ************************************ 00:05:00.562 END TEST denied 00:05:00.562 ************************************ 00:05:00.562 19:18:20 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:00.562 19:18:20 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:00.562 19:18:20 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:00.562 19:18:20 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:00.562 ************************************ 00:05:00.562 START TEST allowed 00:05:00.562 ************************************ 00:05:00.562 19:18:20 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:05:00.562 19:18:20 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:05:00.562 19:18:20 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:05:00.562 19:18:20 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:00.562 19:18:20 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:00.562 19:18:20 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:05.834 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:05.834 19:18:24 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:05.834 19:18:24 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:05.834 19:18:24 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:05.834 19:18:24 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:05.834 19:18:24 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:09.125 00:05:09.125 real 0m8.458s 00:05:09.125 user 0m2.183s 00:05:09.125 sys 0m4.731s 00:05:09.125 19:18:28 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:09.125 19:18:28 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:09.125 ************************************ 00:05:09.125 END TEST allowed 00:05:09.125 ************************************ 00:05:09.125 00:05:09.125 real 0m24.421s 00:05:09.125 user 0m7.542s 00:05:09.125 sys 0m14.924s 00:05:09.125 19:18:28 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:09.125 19:18:28 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:09.125 ************************************ 00:05:09.125 END TEST acl 00:05:09.125 ************************************ 00:05:09.125 19:18:28 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:09.125 19:18:28 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:09.125 19:18:28 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:09.125 19:18:28 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:09.125 ************************************ 00:05:09.125 START TEST hugepages 00:05:09.125 ************************************ 00:05:09.125 19:18:28 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:09.125 * Looking for test storage... 00:05:09.125 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:09.125 19:18:28 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:09.125 19:18:28 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:05:09.125 19:18:28 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:09.125 19:18:28 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:09.125 19:18:28 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:09.125 19:18:29 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:05:09.125 19:18:29 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:05:09.125 19:18:29 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:09.125 19:18:29 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:09.126 19:18:29 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:05:09.126 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:09.126 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:09.126 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.126 --rc genhtml_branch_coverage=1 00:05:09.126 --rc genhtml_function_coverage=1 00:05:09.126 --rc genhtml_legend=1 00:05:09.126 --rc geninfo_all_blocks=1 00:05:09.126 --rc geninfo_unexecuted_blocks=1 00:05:09.126 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.126 ' 00:05:09.126 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:09.126 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.126 --rc genhtml_branch_coverage=1 00:05:09.126 --rc genhtml_function_coverage=1 00:05:09.126 --rc genhtml_legend=1 00:05:09.126 --rc geninfo_all_blocks=1 00:05:09.126 --rc geninfo_unexecuted_blocks=1 00:05:09.126 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.126 ' 00:05:09.126 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:09.126 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.126 --rc genhtml_branch_coverage=1 00:05:09.126 --rc genhtml_function_coverage=1 00:05:09.126 --rc genhtml_legend=1 00:05:09.126 --rc geninfo_all_blocks=1 00:05:09.126 --rc geninfo_unexecuted_blocks=1 00:05:09.126 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.126 ' 00:05:09.126 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:09.126 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:09.126 --rc genhtml_branch_coverage=1 00:05:09.126 --rc genhtml_function_coverage=1 00:05:09.126 --rc genhtml_legend=1 00:05:09.126 --rc geninfo_all_blocks=1 00:05:09.126 --rc geninfo_unexecuted_blocks=1 00:05:09.126 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:09.126 ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:09.126 19:18:29 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39294212 kB' 'MemAvailable: 40939536 kB' 'Buffers: 6816 kB' 'Cached: 11424064 kB' 'SwapCached: 248 kB' 'Active: 8892920 kB' 'Inactive: 3155000 kB' 'Active(anon): 7985240 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620880 kB' 'Mapped: 165668 kB' 'Shmem: 9681976 kB' 'KReclaimable: 593220 kB' 'Slab: 1601264 kB' 'SReclaimable: 593220 kB' 'SUnreclaim: 1008044 kB' 'KernelStack: 21952 kB' 'PageTables: 9016 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 12245232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217972 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # 
IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.126 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 
19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.387 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- 
# read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.388 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@16 -- # 
default_hugepages=2048 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:09.389 19:18:29 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:05:09.389 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:09.389 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:09.389 19:18:29 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:09.389 ************************************ 00:05:09.389 START TEST single_node_setup 00:05:09.389 ************************************ 00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:05:09.389 19:18:29 
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@48 -- # local size=2097152
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]]
00:05:09.389 19:18:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:12.682 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:12.682 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:14.065 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
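What the trace above amounts to: clear_hp zeroes every node's 2 MB hugepage pool (the echo 0 records earlier), then scripts/setup.sh is run with NRHUGE=1024 HUGENODE=0 to allocate 1024 pages on node 0 only, rebinding the ioatdma and nvme devices to vfio-pci along the way. A minimal sketch of the hugepage part, assuming the standard kernel sysfs layout (an illustration, not the actual scripts/setup.sh code):

  # Zero the 2048 kB pool on every node (CLEAR_HUGE=yes behaviour), then
  # request 1024 pages on node 0 only (NRHUGE=1024 HUGENODE=0).
  for hp in /sys/devices/system/node/node*/hugepages/hugepages-2048kB/nr_hugepages; do
      echo 0 > "$hp"
  done
  echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages

The per-node nr_hugepages files are the same knobs the clear_hp loop touches; /proc/sys/vm/nr_hugepages (global_huge_nr above) is the node-agnostic equivalent.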
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # verify_nr_hugepages
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.065 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41457632 kB' 'MemAvailable: 43102916 kB' 'Buffers: 6816 kB' 'Cached: 11424212 kB' 'SwapCached: 248 kB' 'Active: 8895352 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987672 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622404 kB' 'Mapped: 165848 kB' 'Shmem: 9682124 kB' 'KReclaimable: 593180 kB' 'Slab: 1600112 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006932 kB' 'KernelStack: 22080 kB' 'PageTables: 9220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12245788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:14.066 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [... per-key scan elided: MemTotal through HardwareCorrupted are each tested against AnonHugePages and skipped with continue ...]
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41461100 kB' 'MemAvailable: 43106384 kB' 'Buffers: 6816 kB' 'Cached: 11424216 kB' 'SwapCached: 248 kB' 'Active: 8894536 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986856 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621512 kB' 'Mapped: 165768 kB' 'Shmem: 9682128 kB' 'KReclaimable: 593180 kB' 'Slab: 1600068 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006888 kB' 'KernelStack: 21968 kB' 'PageTables: 8700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12246056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
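Every get_meminfo call in this trace expands to the same pattern: snapshot /proc/meminfo, then split each "Key: value" line on ': ' and return the value once the requested key matches. A compact sketch of that pattern (get_meminfo_sketch is a hypothetical name; the real setup/common.sh additionally strips "Node N" prefixes via mapfile so it can parse per-node meminfo files):

  # Print the value of one /proc/meminfo field, e.g. 0 for AnonHugePages.
  get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done < /proc/meminfo
      return 1
  }
  get_meminfo_sketch AnonHugePages   # -> 0 on this host, hence anon=0 above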
00:05:14.067 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [... per-key scan elided: MemTotal through HugePages_Rsvd are each tested against HugePages_Surp and skipped with continue ...]
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41459872 kB' 'MemAvailable: 43105156 kB' 'Buffers: 6816 kB' 'Cached: 11424228 kB' 'SwapCached: 248 kB' 'Active: 8894260 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986580 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621204 kB' 'Mapped: 165760 kB' 'Shmem: 9682140 kB' 'KReclaimable: 593180 kB' 'Slab: 1600068 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006888 kB' 'KernelStack: 21984 kB' 'PageTables: 9040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12246076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
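With anon=0 and surp=0 already read back, this third scan fetches HugePages_Rsvd, after which verify_nr_hugepages can reconcile the pool against the 1024 pages requested on node 0. The exact comparison lives in setup/hugepages.sh; the arithmetic below is an assumed illustration of the bookkeeping, reusing the hypothetical get_meminfo_sketch helper from above:

  # From the snapshot above: HugePages_Total 1024, HugePages_Free 1024,
  # HugePages_Rsvd 0, HugePages_Surp 0.
  total=$(get_meminfo_sketch HugePages_Total)
  surp=$(get_meminfo_sketch HugePages_Surp)
  resv=$(get_meminfo_sketch HugePages_Rsvd)
  # Assumed check: the surplus-adjusted pool should equal the NRHUGE request.
  (( total - surp == 1024 )) && echo 'single_node_setup: hugepage pool matches request'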
00:05:14.069 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [... per-key scan elided: MemTotal through SUnreclaim are each tested against HugePages_Rsvd and skipped with continue ...]
00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.333 19:18:33
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.333 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:14.334 nr_hugepages=1024 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:14.334 resv_hugepages=0 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:14.334 surplus_hugepages=0 00:05:14.334 19:18:33 
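The xtrace above is setup/common.sh's get_meminfo helper scanning a meminfo file one field at a time until the requested field matches (here HugePages_Rsvd, which reads 0). A minimal standalone sketch of that loop, assuming a hypothetical simplified helper rather than the exact SPDK implementation:

get_meminfo() {
    local get=$1 node=${2:-}   # field name, optional NUMA node
    local mem_f=/proc/meminfo
    # Per-node lookups read the node-local meminfo file when it exists
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local line var val _
    while read -r line; do
        line=${line#"Node $node "}       # per-node files prefix each line with "Node <n> "
        IFS=': ' read -r var val _ <<< "$line"
        # Every non-matching field produces one [[ ... ]] / continue pair in the trace
        [[ $var == "$get" ]] || continue
        echo "$val"                      # kB value, or a bare page count for HugePages_* fields
        return 0
    done < "$mem_f"
    return 1
}

Called here as get_meminfo HugePages_Rsvd, and further down as get_meminfo HugePages_Surp 0 for node 0.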
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:14.334 anon_hugepages=0 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41467428 kB' 'MemAvailable: 43112712 kB' 'Buffers: 6816 kB' 'Cached: 11424256 kB' 'SwapCached: 248 kB' 'Active: 8894800 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987120 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621716 kB' 'Mapped: 165768 kB' 'Shmem: 9682168 kB' 'KReclaimable: 593180 kB' 'Slab: 1600068 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006888 kB' 'KernelStack: 22080 kB' 'PageTables: 9252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12246100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.334 19:18:33 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.334
[... the same comparison cycle repeats, with no match, for every meminfo field from MemAvailable through ShmemHugePages, now against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ...]
19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34
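On the next line the scan finally matches HugePages_Total (1024) and returns. The surrounding hugepages.sh logic then asserts that the pool is consistent: the total reported by the kernel must equal the requested page count plus surplus plus reserved pages, here 1024 == 1024 + 0 + 0. A sketch of that arithmetic under the same assumptions as above (hypothetical wrapper name; the real checks are inline (( ... )) assertions in setup/hugepages.sh):

check_hugepages_accounting() {
    local nr_hugepages=$1 surp=$2 resv=$3
    local total
    total=$(get_meminfo HugePages_Total)
    # Every page in the pool is either explicitly requested, surplus, or reserved
    (( total == nr_hugepages + surp + resv ))
}

check_hugepages_accounting 1024 0 0 && echo 'pool consistent'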
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- 
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22485956 kB' 'MemUsed: 10148480 kB' 'SwapCached: 148 kB' 'Active: 5160812 kB' 'Inactive: 535724 kB' 'Active(anon): 4383048 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5418316 kB' 'Mapped: 85956 kB' 'AnonPages: 281360 kB' 'Shmem: 4105200 kB' 'KernelStack: 10120 kB' 'PageTables: 4424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 403352 kB' 'Slab: 889064 kB' 'SReclaimable: 403352 kB' 'SUnreclaim: 485712 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.336 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.336
[... the same comparison cycle repeats, with no match, for every node0 meminfo field from SwapCached through HugePages_Total, against \H\u\g\e\P\a\g\e\s\_\S\u\r\p ...]
19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read
-r var val _ 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:14.338 node0=1024 expecting 1024 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:14.338 00:05:14.338 real 0m4.936s 00:05:14.338 user 0m1.188s 00:05:14.338 sys 0m2.226s 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:14.338 19:18:34 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x 00:05:14.338 ************************************ 00:05:14.338 END TEST single_node_setup 00:05:14.338 ************************************ 00:05:14.338 19:18:34 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc 00:05:14.338 19:18:34 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:14.338 19:18:34 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:14.338 19:18:34 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:14.338 ************************************ 00:05:14.338 START TEST even_2G_alloc 00:05:14.338 ************************************ 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local 
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:14.338 19:18:34 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:17.629 0000:00:04.0 through 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.629 0000:80:04.0 through 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.629 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
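(For readers following the arithmetic: hugepages.sh@80-83 above fills nodes_test[] from the highest node index down, splitting the 1024 requested default-size pages evenly across the two NUMA nodes. A minimal standalone sketch of that split, with illustrative variable names rather than the script's exact code:)

    nr_hugepages=1024          # total 2M pages requested by the test (2097152 kB / 2048 kB)
    no_nodes=2                 # NUMA nodes on this machine
    declare -a nodes_test
    per_node=$(( nr_hugepages / no_nodes ))   # 512 pages per node
    for (( node = no_nodes - 1; node >= 0; node-- )); do
        nodes_test[node]=$per_node            # node1=512 first, then node0=512
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # -> node0=512 node1=512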
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:17.629 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41467104 kB' 'MemAvailable: 43112388 kB' 'Buffers: 6816 kB' 'Cached: 11424376 kB' 'SwapCached: 248 kB' 'Active: 8894696 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987016 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621492 kB' 'Mapped: 164728 kB' 'Shmem: 9682288 kB' 'KReclaimable: 593180 kB' 'Slab: 1600320 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1007140 kB' 'KernelStack: 21856 kB' 'PageTables: 8484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12236836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:17.630 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # (xtrace: each key from MemTotal through HardwareCorrupted compared against AnonHugePages and skipped via 'continue')
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.896 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.897 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41468364 kB' 'MemAvailable: 43113648 kB' 'Buffers: 6816 kB' 'Cached: 11424380 kB' 'SwapCached: 248 kB' 'Active: 8894644 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986964 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621420 kB' 'Mapped: 164612 kB' 'Shmem: 9682292 kB' 'KReclaimable: 593180 kB' 'Slab: 1600276 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1007096 kB' 'KernelStack: 21904 kB' 'PageTables: 8708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12236852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
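(The hugepages.sh@95 test above expands to [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]: the left side is the transparent-hugepage mode string from sysfs, and the anon-hugepage probe only runs when the bracketed active mode is not [never]. A small sketch of the same check, assuming the standard sysfs path; the echo messages are illustrative:)

    # Reads e.g. "always [madvise] never"; the bracketed word is the active mode.
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        echo "THP enabled (${thp}); AnonHugePages is worth checking"
    else
        echo "THP pinned to [never]; skipping the anonymous-hugepage probe"
    fi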
00:05:17.897 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # (xtrace: each key from MemTotal through HugePages_Rsvd compared against HugePages_Surp and skipped via 'continue')
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41467608 kB' 'MemAvailable: 43112892 kB' 'Buffers: 6816 kB' 'Cached: 11424400 kB' 'SwapCached: 248 kB' 'Active: 8894720 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987040 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621424 kB' 'Mapped: 164612 kB' 'Shmem: 9682312 kB' 'KReclaimable: 593180 kB' 'Slab: 1600276 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1007096 kB' 'KernelStack: 21904 kB' 'PageTables: 8708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12236876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
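(verify_nr_hugepages ultimately compares per-node page counts against the expected split, 512 per node here, per nodes_test[] above. A hypothetical helper reading the same counters straight from sysfs; the path is the standard kernel location for 2M pages, and 'expected' is taken from this trace rather than from the script:)

    expected=512   # from nodes_test[] in this run
    for node_dir in /sys/devices/system/node/node[0-9]*; do
        node=${node_dir##*/node}
        nr=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
        echo "node${node}=${nr} expecting ${expected}"
    done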
00:05:17.898 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # (xtrace: per-key scan against HugePages_Rsvd in progress, MemTotal through ShmemHugePages skipped via 'continue'; the log is truncated here, mid-scan)
setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- 
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:17.900 nr_hugepages=1024
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:17.900 resv_hugepages=0
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:17.900 surplus_hugepages=0
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:17.900 anon_hugepages=0
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.900 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.901 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41468112 kB' 'MemAvailable: 43113396 kB' 'Buffers: 6816 kB' 'Cached: 11424400 kB' 'SwapCached: 248 kB' 'Active: 8894720 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987040 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621424 kB' 'Mapped: 164612 kB' 'Shmem: 9682312 kB' 'KReclaimable: 593180 kB' 'Slab: 1600276 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1007096 kB' 'KernelStack: 21904 kB' 'PageTables: 8708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12236896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:17.901 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:17.901 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[trace condensed: the test/continue/read cycle repeats for every key from MemFree through Unaccepted, none of which matches HugePages_Total]
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
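[The scans condensed above are all the same helper at work: setup/common.sh's get_meminfo walks one meminfo file and echoes the value of the first key matching its argument. A minimal standalone sketch of that lookup — a hypothetical reimplementation for illustration, not the vendored setup/common.sh — could look like:

    # Sketch only: print the value of one meminfo key, system-wide or per node.
    get_meminfo() {
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo
        # When a node index is given, prefer the per-node sysfs meminfo,
        # whose lines carry a "Node N " prefix that must be stripped.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip non-matching keys
            echo "$val"
            return 0
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
        return 1
    }

On this box, get_meminfo HugePages_Total would print 1024 and get_meminfo HugePages_Rsvd would print 0 — the values the trace assigns to nr_hugepages and resv above.]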
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23540520 kB' 'MemUsed: 9093916 kB' 'SwapCached: 148 kB' 'Active: 5162292 kB' 'Inactive: 535724 kB' 'Active(anon): 4384528 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5418420 kB' 'Mapped: 85140 kB' 'AnonPages: 282712 kB' 'Shmem: 4105304 kB' 'KernelStack: 10120 kB' 'PageTables: 4360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 403352 kB' 'Slab: 889332 kB' 'SReclaimable: 403352 kB' 'SUnreclaim: 485980 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.902 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[trace condensed: the test/continue/read cycle repeats for every node0 meminfo key from MemFree through HugePages_Free, none of which matches HugePages_Surp]
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
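[Per node, setup/hugepages.sh lines 114-116 fold the reserved count and that node's surplus into the expected totals before any comparison. A hedged reconstruction of that accounting loop, reusing the get_meminfo sketch above and assuming nodes_test starts at the even 512-page split (illustrative only; the script's later assertion is outside this excerpt):

    # Illustrative only: accumulate resv and per-node surplus as traced above.
    declare -a nodes_test=(512 512)   # expected even split across node0/node1
    resv=0                            # HugePages_Rsvd read earlier in the trace
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                                   # @115
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))  # @116
    done
    # With resv=0 and surplus 0 on both nodes, each entry stays at 512 pages.

The trace's "(( nodes_test[node] += 0 ))" lines are exactly this: the surplus returned for each node is 0, so the expected counts are unchanged.]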
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17927484 kB' 'MemUsed: 9721876 kB' 'SwapCached: 100 kB' 'Active: 3732160 kB' 'Inactive: 2619276 kB' 'Active(anon): 3602244 kB' 'Inactive(anon): 2313256 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6013048 kB' 'Mapped: 79472 kB' 'AnonPages: 338440 kB' 'Shmem: 5577012 kB' 'KernelStack: 11752 kB' 'PageTables: 4252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 189828 kB' 'Slab: 710944 kB' 'SReclaimable: 189828 kB' 'SUnreclaim: 521116 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.904 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[trace condensed: the test/continue/read cycle repeats for the node1 meminfo keys from MemFree through SUnreclaim]
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- 
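The helper being stepped through above is setup/common.sh's get_meminfo. Reconstructed from the xtrace as a self-contained sketch (names, paths, and control flow follow the trace; this is not the verbatim SPDK source):

    #!/usr/bin/env bash
    # Sketch of the get_meminfo pattern seen in the xtrace: read one field
    # from /proc/meminfo, or from a single NUMA node's meminfo when a node
    # number is supplied.
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # Per-node counters live in /sys and prefix every line with "Node N ".
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")
        # Scan "Key: value [kB]" lines until the requested key matches;
        # this loop is what produces the long compare/continue trace above.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo HugePages_Surp 1   # against the node1 snapshot above: 0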
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:05:17.905 19:18:37 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:17.905
00:05:17.905 real 0m3.566s
00:05:17.905 user 0m1.302s
00:05:17.905 sys 0m2.328s
19:18:37 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
19:18:37 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:17.905 ************************************
00:05:17.905 END TEST even_2G_alloc
00:05:17.905 ************************************
00:05:17.905 19:18:37 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:17.905 19:18:37 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:17.905 19:18:37 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:17.905 19:18:37 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:17.905 ************************************
00:05:17.905 START TEST odd_alloc
00:05:17.905 ************************************
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
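Worth noting how odd_alloc arrives at nr_hugepages=1025: the requested size of 2098176 kB (HUGEMEM=2049, i.e. 2049 MiB) is not an even multiple of the 2048 kB default hugepage size, so the page count comes out odd. A sketch of the implied round-up (the trace only shows the input and the result; the exact expression hugepages.sh uses is an assumption):

    # 2049 MiB = 2098176 kB; 2098176 / 2048 = 1024.5, rounded up to 1025.
    size=2098176             # kB, per "local size=2098176" in the trace
    default_hugepages=2048   # kB per 2M hugepage
    nr_hugepages=$(( (size + default_hugepages - 1) / default_hugepages ))
    echo "$nr_hugepages"     # 1025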
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:17.905 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:17.906 19:18:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:21.199 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:21.199 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:21.199 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:21.199 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:21.199 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:21.461 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
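The loop traced above (hugepages.sh@80-83) splits those 1025 pages across _no_nodes=2, highest node first, so node 1 receives 1025/2 = 512 and the remaining 513 land on node 0. A reconstruction consistent with the traced assignments and the ": 513" / ": 1" arithmetic side effects (again a sketch, not the verbatim script):

    # Distribute _nr_hugepages over the nodes; the remainder accumulates on
    # node 0, which is how an odd total stays odd on one node.
    _nr_hugepages=1025
    _no_nodes=2
    nodes_test=()
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # traces as ": 513", then ": 0"
        : $(( --_no_nodes ))                                  # traces as ": 1", then ": 0"
    done
    declare -p nodes_test   # declare -a nodes_test=([0]="513" [1]="512")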
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.461 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41476472 kB' 'MemAvailable: 43121756 kB' 'Buffers: 6816 kB' 'Cached: 11424548 kB' 'SwapCached: 248 kB' 'Active: 8893796 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986116 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620400 kB' 'Mapped: 164704 kB' 'Shmem: 9682460 kB' 'KReclaimable: 593180 kB' 'Slab: 1600164 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006984 kB' 'KernelStack: 21952 kB' 'PageTables: 8872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12237528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... xtrace elided: per-key scan of the fields above until AnonHugePages matches ...]
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
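verify_nr_hugepages only treats transparent hugepages as a factor when they can actually be in use: the [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test above checks that THP is not pinned to "never" before AnonHugePages is read. On this machine AnonHugePages is 0 kB, hence anon=0. Sketched, reusing the get_meminfo sketch from earlier (what verify_nr_hugepages does with anon afterwards is outside this excerpt):

    # Account for THP-backed anonymous memory only when THP is enabled.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # kB; 0 in the run above
    fi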
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.463 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41478948 kB' 'MemAvailable: 43124232 kB' 'Buffers: 6816 kB' 'Cached: 11424548 kB' 'SwapCached: 248 kB' 'Active: 8894076 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986396 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620640 kB' 'Mapped: 164620 kB' 'Shmem: 9682460 kB' 'KReclaimable: 593180 kB' 'Slab: 1600116 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006936 kB' 'KernelStack: 21888 kB' 'PageTables: 8672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12237536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... xtrace elided: per-key scan of the fields above until HugePages_Surp matches ...]
00:05:21.465 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.465 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.465 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:21.465 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
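With anon and surp known and resv queried next, the per-node expectations get adjusted the same way the hugepages.sh@114-116 loop did for even_2G_alloc at the top of this excerpt: the global reserved count and each node's own surplus are folded into nodes_test before the "nodeN=X expecting X" comparison. A hedged sketch of that fold, reconstructed from the traced loop (in this run resv and every surplus are 0, so the values pass through unchanged):

    # Expected pages per node = planned split + global reserved + that
    # node's surplus; here all adjustments are zero, so odd_alloc should
    # report node0=513 and node1=512.
    resv=$(get_meminfo HugePages_Rsvd)
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv )) || true   # || true: adding 0 yields exit status 1
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") )) || true
    done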
# [[ -n '' ]] 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41478948 kB' 'MemAvailable: 43124232 kB' 'Buffers: 6816 kB' 'Cached: 11424568 kB' 'SwapCached: 248 kB' 'Active: 8893960 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986280 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620576 kB' 'Mapped: 164620 kB' 'Shmem: 9682480 kB' 'KReclaimable: 593180 kB' 'Slab: 1600116 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006936 kB' 'KernelStack: 21904 kB' 'PageTables: 8720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12237556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.466 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
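(The xtrace above is SPDK's get_meminfo helper walking a meminfo snapshot key by key until it reaches the requested field. Below is a minimal standalone sketch of that pattern, reconstructed from the trace rather than copied verbatim from setup/common.sh; treat it as illustrative.)

#!/usr/bin/env bash
# Sketch of the get_meminfo lookup pattern seen in the trace above.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=${2:-} var val _ line
    local mem_f=/proc/meminfo
    local -a mem
    # Per-NUMA-node counters live under /sys when a node index is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Node files prefix every line with "Node <n> "; strip it, as the trace does.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

get_meminfo HugePages_Rsvd     # system-wide reserved hugepages (0 in this run)
get_meminfo HugePages_Surp 0   # surplus hugepages on NUMA node 0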
00:05:21.729 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025
00:05:21.730 nr_hugepages=1025
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:21.730 resv_hugepages=0
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:21.730 surplus_hugepages=0
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:21.730 anon_hugepages=0
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages ))
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41484176 kB' 'MemAvailable: 43129460 kB' 'Buffers: 6816 kB' 'Cached: 11424588 kB' 'SwapCached: 248 kB' 'Active: 8894028 kB' 'Inactive: 3155000 kB' 'Active(anon): 7986348 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620572 kB' 'Mapped: 164620 kB' 'Shmem: 9682500 kB' 'KReclaimable: 593180 kB' 'Slab: 1600100 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006920 kB' 'KernelStack: 21904 kB' 'PageTables: 8720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12237576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
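(The (( 1025 == nr_hugepages + surp + resv )) check above asserts that the kernel's hugepage accounting is self-consistent: HugePages_Total must equal the requested count plus surplus plus reserved pages. A hedged one-off version of the same check, awk-based and standalone, with this run's values hard-coded:)

#!/usr/bin/env bash
# Sketch of the accounting check traced above; 1025 is what this run requested.
nr_hugepages=1025
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: $total == $nr_hugepages + $surp + $resv"
else
    echo "mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
fi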
'Committed_AS: 12237576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.730 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.731 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- 
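(The get_nodes trace above is the heart of the odd_alloc case: an odd hugepage count, 1025, cannot split evenly across the two NUMA nodes, so the test expects 513 on node 0 and 512 on node 1. A hypothetical standalone sketch of that bookkeeping follows; SPDK's hugepages.sh drives the same arrays differently.)

#!/usr/bin/env bash
# Sketch of the per-node split from the trace: 1025 hugepages -> 513 + 512.
shopt -s extglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=0    # index = numeric suffix of "nodeN"
done
no_nodes=${#nodes_sys[@]}
(( no_nodes > 0 )) || { echo "no NUMA nodes found" >&2; exit 1; }

nr_hugepages=1025
# An odd total cannot split evenly; this run gives the extra page to node 0.
base=$(( nr_hugepages / no_nodes ))
extra=$(( nr_hugepages % no_nodes ))
for n in "${!nodes_sys[@]}"; do
    nodes_sys[n]=$(( base + (n < extra ? 1 : 0) ))
done
echo "expected per-node HugePages_Total: ${nodes_sys[*]}"   # 513 512 on this box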
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23551824 kB' 'MemUsed: 9082612 kB' 'SwapCached: 148 kB' 'Active: 5161116 kB' 'Inactive: 535724 kB' 'Active(anon): 4383352 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5418584 kB' 'Mapped: 85148 kB' 'AnonPages: 281388 kB' 'Shmem: 4105468 kB' 'KernelStack: 10168 kB' 'PageTables: 4608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 403352 kB' 'Slab: 889328 kB' 'SReclaimable: 403352 kB' 'SUnreclaim: 485976 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[xtrace of the per-key scan elided: every node0 key from MemTotal through HugePages_Free fails the \H\u\g\e\P\a\g\e\s\_\S\u\r\p match and hits "continue"]
IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.732 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- 
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [field scan: each remaining node0 meminfo key (Writeback … HugePages_Free) is read with IFS=': ' read -r var val _, compared against HugePages_Surp, and skipped with continue]
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
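For readers skimming the trace: the block above is one call to get_meminfo, the helper in setup/common.sh that scans a meminfo file one key at a time. A minimal re-creation of that pattern, reconstructed from the traced statements (a sketch, not the verbatim SPDK helper):

#!/usr/bin/env bash
shopt -s extglob
# Reads one key out of /proc/meminfo, or out of the per-node sysfs copy when
# a node number is given. The long runs in the trace are simply the
# `continue` branch firing once per non-matching key.
get_meminfo() {
    local get=$1 node=${2:-}
    local var val _
    local mem_f=/proc/meminfo
    local -a mem
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")       # strip the sysfs "Node N " prefix
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # one trace line per skipped key
        echo "$val"                        # kB value, or a bare page count
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Surp 1   # prints 0 for node 1 in the run above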
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@17-23 -- # local get=HugePages_Surp node=1 var val mem_f mem; mem_f=/proc/meminfo; [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@28-29 -- # mapfile -t mem; mem=("${mem[@]#Node +([0-9]) }")
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17932352 kB' 'MemUsed: 9717008 kB' 'SwapCached: 100 kB' 'Active: 3732936 kB' 'Inactive: 2619276 kB' 'Active(anon): 3603020 kB' 'Inactive(anon): 2313256 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6013088 kB' 'Mapped: 79472 kB' 'AnonPages: 339184 kB' 'Shmem: 5577052 kB' 'KernelStack: 11736 kB' 'PageTables: 4112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 189828 kB' 'Slab: 710772 kB' 'SReclaimable: 189828 kB' 'SUnreclaim: 520944 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [field scan: every node1 key (MemTotal … HugePages_Free) compared against HugePages_Surp and skipped with continue]
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.733 19:18:41 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513'
00:05:21.735 node0=513 expecting 513
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:21.735 node1=512 expecting 512
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:05:21.735 real 0m3.674s
00:05:21.735 user 0m1.343s
00:05:21.735 sys 0m2.397s
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:21.735 19:18:41 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:21.735 ************************************
00:05:21.735 END TEST odd_alloc
00:05:21.735 ************************************
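The node0=513 / node1=512 check just above relies on a small bash trick: the per-node page counts are used as array indices, so the expanded index lists come out numerically sorted with duplicates collapsed, and a single string compare settles the test. A sketch of that comparison, with the values and variable names from this run:

#!/usr/bin/env bash
# Values below are the ones logged above; names follow the trace.
declare -a nodes_test=([0]=513 [1]=512)   # counts the test configured
declare -a nodes_sys=([0]=513 [1]=512)    # counts the kernel reports
declare -a sorted_t sorted_s

for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1          # count used as an index
    sorted_s[nodes_sys[node]]=1
    echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
done

# "${!sorted_t[*]}" expands to the used indices in ascending order: "512 513"
[[ ${!sorted_s[*]} == "${!sorted_t[*]}" ]] && echo "hugepage layout verified"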
00:05:21.735 19:18:41 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:05:21.735 19:18:41 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:21.735 19:18:41 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:21.735 ************************************
00:05:21.735 START TEST custom_alloc
00:05:21.735 ************************************
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157-162 -- # local IFS=, node nodes_hp=() nr_hugepages=0 _nr_hugepages=0
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576 (1 GiB at the 2048 kB default page size, so nr_hugepages=512)
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node (no user nodes yet: 512 pages split evenly, nodes_test[0]=256 nodes_test[1]=256)
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
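The first get_test_nr_hugepages_per_node call above splits the 512 pages evenly because no per-node counts were requested. A sketch of that default split loop, using the variable names from the trace:

#!/usr/bin/env bash
# 512 pages over 2 nodes yields 256 + 256 as logged. The loop walks nodes
# from the highest index down, so an odd total (e.g. 1025) would leave the
# extra page on node 0: the 513/512 layout seen in odd_alloc above.
_nr_hugepages=512
_no_nodes=2
declare -a nodes_test

while ((_no_nodes > 0)); do
    nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes))   # this node's share
    ((_nr_hugepages -= nodes_test[_no_nodes - 1]))             # pages still to place
    ((_no_nodes -= 1))
done

declare -p nodes_test   # declare -a nodes_test=([0]="256" [1]="256")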
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152 (2 GiB, so nr_hugepages=1024)
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node (nodes_hp[0] already pinned: nodes_test[0]=512)
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171-173 -- # for node in "${!nodes_hp[@]}": HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}"); (( _nr_hugepages += nodes_hp[node] )) [runs for node 0 and node 1]
00:05:21.735 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
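The HUGENODE array built above is later flattened into one comma-separated string; custom_alloc sets local IFS=, at its top for exactly this reason. A sketch of the construction, with the values from this run:

#!/usr/bin/env bash
declare -a nodes_hp=([0]=512 [1]=1024)
declare -a HUGENODE
_nr_hugepages=0

for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")   # one entry per node
    ((_nr_hugepages += nodes_hp[node]))               # running total: 1536
done

IFS=,
echo "HUGENODE=${HUGENODE[*]}"   # HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024
echo "total: $_nr_hugepages"     # total: 1536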
00:05:21.736 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73-75 -- # (( 2 > 0 )): nodes_test[0]=512 nodes_test[1]=1024; return 0
00:05:21.736 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:21.736 19:18:41 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:21.736 19:18:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:21.736 19:18:41 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:25.028 [setup.sh driver check: 0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7 (8086 2021) plus 0000:d8:00.0 (8086 0a54) all report "Already using the vfio-pci driver"]
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
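Stated as a plain command line, the step above amounts to the following: the HUGENODE string carries the per-node page counts into scripts/setup.sh through the environment (the path is this run's workspace; substitute your own SPDK checkout, and note setup.sh normally needs root):

HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' \
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh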
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88-93 -- # local node sorted_t sorted_s surp resv anon
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@17-25 -- # local get=AnonHugePages node= var val mem_f mem; mem_f=/proc/meminfo; [[ -e /sys/devices/system/node/node/meminfo ]]; [[ -n '' ]]
00:05:25.292 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28-29 -- # mapfile -t mem; mem=("${mem[@]#Node +([0-9]) }")
00:05:25.293 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40423392 kB' 'MemAvailable: 42068676 kB' 'Buffers: 6816 kB' 'Cached: 11424728 kB' 'SwapCached: 248 kB' 'Active: 8895756 kB' 'Inactive: 3155000 kB' 'Active(anon): 7988076 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622144 kB' 'Mapped: 164684 kB' 'Shmem: 9682640 kB' 'KReclaimable: 593180 kB' 'Slab: 1600072 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006892 kB' 'KernelStack: 21968 kB' 'PageTables: 8852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12239728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218244 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:25.293 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [field scan: every /proc/meminfo key (MemTotal … HardwareCorrupted) compared against AnonHugePages and skipped with continue]
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@17-25 -- # local get=HugePages_Surp node= var val mem_f mem; mem_f=/proc/meminfo; [[ -n '' ]]
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28-29 -- # mapfile -t mem; mem=("${mem[@]#Node +([0-9]) }")
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40424620 kB' 'MemAvailable: 42069904 kB' 'Buffers: 6816 kB' 'Cached: 11424732 kB' 'SwapCached: 248 kB' 'Active: 8895396 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987716 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621860 kB' 'Mapped: 164652 kB' 'Shmem: 9682644 kB' 'KReclaimable: 593180 kB' 'Slab: 1600080 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006900 kB' 'KernelStack: 22000 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12241012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
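The snapshots above are what verify_nr_hugepages consumes. A sketch of its opening reads as traced, with awk standing in for the get_meminfo helper so the snippet is self-contained (the THP state string and both values are the ones logged in this run):

#!/usr/bin/env bash
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # "always [madvise] never"
anon=0
if [[ $thp != *"[never]"* ]]; then
    # THP is not fully disabled, so AnonHugePages may be nonzero and has to
    # be accounted for; it reads back as 0 kB here.
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
fi
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)   # 0 in this run
echo "anon=$anon surp=$surp"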
00:05:25.294 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [field scan: each /proc/meminfo key compared against HugePages_Surp and skipped with continue; scan continues]
-- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.295 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.296 19:18:45 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40422956 kB' 'MemAvailable: 42068240 kB' 'Buffers: 6816 kB' 'Cached: 11424752 kB' 'SwapCached: 248 kB' 'Active: 8895396 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987716 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621800 kB' 'Mapped: 164652 kB' 'Shmem: 9682664 kB' 'KReclaimable: 593180 kB' 'Slab: 1599984 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006804 kB' 'KernelStack: 22032 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12241032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
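For readability, the loop being traced above can be reconstructed from the line tags alone (common.sh@17-33): mapfile the meminfo file, strip any per-node "Node N " prefix, split each line on ': ', and echo the value of the requested key. A minimal standalone sketch follows; get_meminfo_sketch is a hypothetical name, not the actual helper in setup/common.sh, and the loop shape is inferred from the trace rather than copied from the source.

  #!/usr/bin/env bash
  # Hypothetical reconstruction of the scan traced above (not the real
  # setup/common.sh helper). extglob is needed for the +([0-9]) pattern.
  shopt -s extglob

  get_meminfo_sketch() {
      local get=$1 node=$2
      local var val _ line
      local mem_f=/proc/meminfo
      # When a node number is passed, prefer the per-node meminfo file;
      # with $node empty this test matches the ".../node/node/meminfo"
      # probe visible in the trace and falls back to /proc/meminfo.
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local mem
      mapfile -t mem < "$mem_f"
      # Per-node lines are prefixed "Node N "; strip that prefix.
      mem=("${mem[@]#Node +([0-9]) }")
      for line in "${mem[@]}"; do
          # Split "Key:   value kB" on ':' and whitespace.
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue  # the continue lines in the trace
          echo "${val:-0}"
          return 0
      done
      echo 0
  }

  get_meminfo_sketch HugePages_Surp  # prints 0 on this runner, matching surp=0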
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.296 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40422956 kB' 'MemAvailable: 42068240 kB' 'Buffers: 6816 kB' 'Cached: 11424752 kB' 'SwapCached: 248 kB' 'Active: 8895396 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987716 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621800 kB' 'Mapped: 164652 kB' 'Shmem: 9682664 kB' 'KReclaimable: 593180 kB' 'Slab: 1599984 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006804 kB' 'KernelStack: 22032 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12241032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:25.296-00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [repetitive per-key scan condensed: every meminfo key from MemTotal through HugePages_Free fails the [[ $var == HugePages_Rsvd ]] test and hits continue]
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536
00:05:25.298 nr_hugepages=1536
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:25.298 resv_hugepages=0
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:25.298 surplus_hugepages=0
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:25.298 anon_hugepages=0
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
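The two arithmetic guards just logged (hugepages.sh@106 and @108) only assert that the pool observed in meminfo is fully accounted for. An illustrative recap with the values captured in this run; the variable names mirror the trace:

  nr_hugepages=1536   # get_meminfo HugePages_Total
  surp=0              # get_meminfo HugePages_Surp
  resv=0              # get_meminfo HugePages_Rsvd
  # The requested pool (1536) must equal allocated + surplus + reserved.
  (( 1536 == nr_hugepages + surp + resv )) && echo "hugepage pool accounted for"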
node= 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40421564 kB' 'MemAvailable: 42066848 kB' 'Buffers: 6816 kB' 'Cached: 11424772 kB' 'SwapCached: 248 kB' 'Active: 8895236 kB' 'Inactive: 3155000 kB' 'Active(anon): 7987556 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621620 kB' 'Mapped: 164652 kB' 'Shmem: 9682684 kB' 'KReclaimable: 593180 kB' 'Slab: 1599984 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006804 kB' 'KernelStack: 21920 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12240808 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.298 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 
19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:25.299 19:18:45 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.299 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
[... xtrace elided: Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree and Unaccepted each compared against HugePages_Total, no match, continue ...]
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
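The scan above is the generic meminfo lookup from setup/common.sh stepping key by key until HugePages_Total matches. For readability, here is that helper reconstructed from the xtrace as a standalone sketch (paraphrased from the trace markers, not the verbatim SPDK source):

    # get_meminfo <key> [node] -- print the value of one meminfo key, preferring
    # the per-node sysfs copy when a node is given (mirrors common.sh@17-33).
    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f mem line
        shopt -s extglob                      # for the +([0-9]) pattern below

        mem_f=/proc/meminfo
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            # per-node file; its lines carry a "Node N " prefix
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")      # strip the "Node N " prefix

        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue  # the per-key scan seen in the log
            echo "$val"
            return 0
        done
        return 1
    }

Against the values dumped in this run, get_meminfo HugePages_Total prints 1536, and get_meminfo HugePages_Surp 0 walks node0's meminfo the same way.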
00:05:25.300 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23532640 kB' 'MemUsed: 9101796 kB' 'SwapCached: 148 kB' 'Active: 5163068 kB' 'Inactive: 535724 kB' 'Active(anon): 4385304 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5418692 kB' 'Mapped: 85176 kB' 'AnonPages: 283200 kB' 'Shmem: 4105576 kB' 'KernelStack: 10312 kB' 'PageTables: 4800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 403352 kB' 'Slab: 889308 kB' 'SReclaimable: 403352 kB' 'SUnreclaim: 485956 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: MemTotal through HugePages_Free each compared against HugePages_Surp, no match, continue ...]
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.562 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 16887796 kB' 'MemUsed: 10761564 kB' 'SwapCached: 100 kB' 'Active: 3731944 kB' 'Inactive: 2619276 kB' 'Active(anon): 3602028 kB' 'Inactive(anon): 2313256 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 6013184 kB' 'Mapped: 79476 kB' 'AnonPages: 338100 kB' 'Shmem: 5577148 kB' 'KernelStack: 11704 kB' 'PageTables: 4112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 189828 kB' 'Slab: 710676 kB' 'SReclaimable: 189828 kB' 'SUnreclaim: 520848 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
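The dump above is node1's meminfo; the scan that follows resolves HugePages_Surp for it, exactly as it just did for node0. The surrounding accounting (setup/hugepages.sh@114-116 in the trace) folds reserved and surplus pages into the per-node expectation. A minimal sketch of that loop, reusing the get_meminfo sketch from earlier and the counts this run requested:

    # Expected hugepages per node: what the test asked for, plus any reserved
    # and surplus pages the kernel reports (both 0 in this run).
    declare -a nodes_test=([0]=512 [1]=1024)
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        surp=$(get_meminfo HugePages_Surp "$node")
        (( nodes_test[node] += surp ))
    done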
[... xtrace elided: MemTotal through HugePages_Free each compared against HugePages_Surp, no match, continue ...]
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:25.563
00:05:25.563 real 0m3.696s
00:05:25.563 user 0m1.427s
00:05:25.563 sys 0m2.325s
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:25.563 19:18:45 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:25.563 ************************************
00:05:25.563 END TEST custom_alloc
00:05:25.563 ************************************
00:05:25.563 19:18:45 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:25.563 19:18:45 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:25.563 19:18:45 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:25.563 19:18:45 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:25.563 ************************************
00:05:25.563 START TEST no_shrink_alloc
00:05:25.563 ************************************
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:25.563 19:18:45 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:28.856 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:28.856 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.856 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41438440 kB' 'MemAvailable: 43083724 kB' 'Buffers: 6816 kB' 'Cached: 11424904 kB' 'SwapCached: 248 kB' 'Active: 8897364 kB' 'Inactive: 3155000 kB' 'Active(anon): 7989684 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623152 kB' 'Mapped: 164756 kB' 'Shmem: 9682816 kB' 'KReclaimable: 593180 kB' 'Slab: 1599700 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006520 kB' 'KernelStack: 21904 kB' 'PageTables: 8748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
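The no_shrink_alloc prologue above (setup/hugepages.sh@48-72) turns a requested pool size into a page count and pins it to the named node. A sketch of that derivation, a hypothetical reconstruction rather than the verbatim SPDK helper, under the assumption that both the size argument and the hugepage size are in kB (2048 kB pages on this machine, hence 2097152 / 2048 = 1024):

    # get_test_nr_hugepages <size-kB> [node...]
    get_test_nr_hugepages() {
        local size=$1; shift
        local node_ids=("$@")                  # ('0') in this run
        local default_hugepages=2048           # assumed: Hugepagesize in kB
        local node
        (( size >= default_hugepages )) || return 1
        nr_hugepages=$(( size / default_hugepages ))   # 2097152 / 2048 = 1024
        declare -ga nodes_test=()
        # When node ids were passed, place the whole allocation on those nodes.
        for node in "${node_ids[@]}"; do
            nodes_test[node]=$nr_hugepages
        done
    }
    get_test_nr_hugepages 2097152 0   # -> nr_hugepages=1024, nodes_test[0]=1024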
-- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.857 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.122 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.122 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.122 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.122 19:18:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.123 19:18:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
[xtrace condensed: the setup/common.sh@31-32 read/compare loop walks the remaining /proc/meminfo keys (Slab through HardwareCorrupted), each failing the AnonHugePages match and hitting 'continue']
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
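The stretch of trace above is setup/common.sh's get_meminfo helper at work: it snapshots /proc/meminfo (or a NUMA node's own meminfo file, whose lines carry a "Node N " prefix that gets stripped), then splits each line on ': ' and walks the keys, continuing past every non-match until it finds the requested one and echoes its value. The sketch below is reconstructed from the @17-@33 trace lines; the for/here-string loop is a paraphrase of the traced read loop, not the verbatim SPDK source.

#!/usr/bin/env bash
# Sketch of the get_meminfo pattern visible in the xtrace above (a reconstruction).
shopt -s extglob # needed for the +([0-9]) pattern used below

get_meminfo() {
    local get=$1 node=${2:-} # key to look up; optional NUMA node number
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # A per-node query reads that node's own meminfo file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }") # strip the "Node N " prefix so keys align
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line" # e.g. var=AnonHugePages val=0
        [[ $var == "$get" ]] || continue       # the long run of 'continue' above
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo AnonHugePages # prints 0 against the snapshot printed at @16

Given that snapshot, the only key equal to AnonHugePages is the AnonHugePages line itself, so the helper echoes 0 and hugepages.sh records anon=0, exactly as the trace shows.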
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.123 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.124 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41439904 kB' 'MemAvailable: 43085188 kB' 'Buffers: 6816 kB' 'Cached: 11424908 kB' 'SwapCached: 248 kB' 'Active: 8896196 kB' 'Inactive: 3155000 kB' 'Active(anon): 7988516 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622476 kB' 'Mapped: 164648 kB' 'Shmem: 9682820 kB' 'KReclaimable: 593180 kB' 'Slab: 1599660 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006480 kB' 'KernelStack: 21904 kB' 'PageTables: 8716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace condensed: the setup/common.sh@31-32 loop compares every snapshot key from MemTotal through HugePages_Rsvd against HugePages_Surp and issues 'continue' on each non-match]
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.125 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.126 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41439360 kB' 'MemAvailable: 43084644 kB' 'Buffers: 6816 kB' 'Cached: 11424924 kB' 'SwapCached: 248 kB' 'Active: 8896204 kB' 'Inactive: 3155000 kB' 'Active(anon): 7988524 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622476 kB' 'Mapped: 164648 kB' 'Shmem: 9682836 kB' 'KReclaimable: 593180 kB' 'Slab: 1599660 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006480 kB' 'KernelStack: 21904 kB' 'PageTables: 8716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239080 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace condensed: the setup/common.sh@31-32 loop compares every snapshot key from MemTotal through HugePages_Free against HugePages_Rsvd and issues 'continue' on each non-match]
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:29.127 nr_hugepages=1024
19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:29.127 resv_hugepages=0
19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:29.127 surplus_hugepages=0
19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:29.127 anon_hugepages=0
19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
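The @96-@108 lines are the no_shrink_alloc bookkeeping: the three get_meminfo lookups land in anon, surp and resv, the values are echoed into the log, and two arithmetic guards assert that the configured pool of 1024 hugepages is fully accounted for before the test proceeds. Below is a compact sketch of that accounting, reconstructed from the trace; the awk helper stands in for get_meminfo, and while the variable names mirror the trace, this is not the hugepages.sh source.

#!/usr/bin/env bash
# Reconstruction of the hugepages accounting traced at setup/hugepages.sh@96-108.
meminfo() { awk -v k="$1" -F': +' '$1 == k { print $2 + 0 }' /proc/meminfo; }

nr_hugepages=1024                # pool size the suite configured earlier
anon=$(meminfo AnonHugePages)    # transparent hugepage usage, in kB
surp=$(meminfo HugePages_Surp)   # surplus pages allocated beyond the pool
resv=$(meminfo HugePages_Rsvd)   # pages reserved but not yet faulted in

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"
echo "anon_hugepages=$anon"

# Guards in the spirit of @106/@108: the kernel's total must cover the
# requested pool plus surplus/reserved pages, and with surp == resv == 0
# it must equal the pool exactly.
total=$(meminfo HugePages_Total)
(( total == nr_hugepages + surp + resv )) || exit 1
(( total == nr_hugepages )) || exit 1

With the snapshot above (HugePages_Total: 1024, HugePages_Surp: 0, HugePages_Rsvd: 0) both guards pass, which is why the trace moves straight on to the HugePages_Total lookup that follows.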
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:29.127 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41439360 kB' 'MemAvailable: 43084644 kB' 'Buffers: 6816 kB' 'Cached: 11424948 kB' 'SwapCached: 248 kB' 'Active: 8895824 kB' 'Inactive: 3155000 kB' 'Active(anon): 7988144 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622024 kB' 'Mapped: 164648 kB' 'Shmem: 9682860 kB' 'KReclaimable: 593180 kB' 'Slab: 1599660 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006480 kB' 'KernelStack: 21856 kB' 'PageTables: 8552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:29.128 19:18:48
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.128 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 
19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:29.129 19:18:48 
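Every value read in this section comes through the same helper: setup/common.sh's get_meminfo loads /proc/meminfo (or one NUMA node's meminfo, when a node id is passed) into an array and walks it key by key until the requested field matches, which is why the trace repeats the IFS=': ' / read / [[ ... ]] / continue quartet once per field. The backslash-escaped right-hand sides such as \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l are just bash xtrace rendering a quoted pattern character by character so it is matched literally, not garbage in the log. A minimal sketch of the lookup, reconstructed from the xtrace output above rather than copied from the SPDK source:

    #!/usr/bin/env bash
    shopt -s extglob    # needed for the +([0-9]) pattern below

    # get_meminfo <field> [node]: print <field>'s value from /proc/meminfo,
    # or from one NUMA node's meminfo when a node id is given.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _ line
        local mem_f=/proc/meminfo mem
        # With a node id the per-node file exists and wins; with node unset
        # the probe path is the bogus /sys/devices/system/node/node/meminfo
        # seen in the trace, which fails -e and leaves /proc/meminfo in place.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem <"$mem_f"
        # Per-node meminfo prefixes each line with "Node <id> "; strip it so
        # every entry is a plain "Key: value [kB]" record.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            if [[ $var == "$get" ]]; then
                echo "$val"     # e.g. 1024 for HugePages_Total above
                return 0
            fi
        done
        return 1
    }

Under this reading, the echo 1024 / return 0 above is simply HugePages_Total being reached near the end of the snapshot, and hugepages.sh@109 then asserts those 1024 pages equal the requested nr_hugepages plus surplus and reserved pages before walking the nodes.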
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.129 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.130 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22486380 kB' 'MemUsed: 10148056 kB' 'SwapCached: 148 kB' 'Active: 5162988 kB' 'Inactive: 535724 kB' 'Active(anon): 4385224 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5418728 kB' 'Mapped: 85168 kB' 'AnonPages: 283104 kB' 'Shmem: 4105612 kB' 'KernelStack: 10168 kB' 'PageTables: 4548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 403352 kB' 'Slab: 889280 kB' 'SReclaimable: 403352 kB' 'SUnreclaim: 485928 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... same @31/@32 scan over the node0 snapshot; every field before HugePages_Surp hits continue ...]
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
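get_nodes, traced just before this lookup, discovered two NUMA nodes and recorded nodes_sys[0]=1024 and nodes_sys[1]=0; the HugePages_Surp probe above then confirms node0 carries no surplus pages, so the sorted_t/sorted_s bookkeeping ends up comparing a clean 1024-versus-1024 split. A sketch of that per-node walk, assuming 2048 kB pages and the standard kernel sysfs layout (the actual SPDK get_nodes may obtain the values differently):

    #!/usr/bin/env bash
    shopt -s extglob
    # Per-node pool sizes, indexed by numeric node id as in ${node##*node}.
    declare -A nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    for id in "${!nodes_sys[@]}"; do
        printf 'node%s=%s hugepages\n' "$id" "${nodes_sys[$id]}"
    done

On this runner that would print node0=1024 hugepages and node1=0 hugepages, matching the assignments in the trace.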
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:29.131 node0=1024 expecting 1024
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:29.131 19:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:32.425 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:32.425 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:32.425 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
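This is the actual point of the no_shrink_alloc case: node0 already holds 1024 pages, and setup.sh is re-run asking for only 512 (NRHUGE=512 HUGENODE=0) with CLEAR_HUGE=no so the existing pool is not torn down first. The INFO line shows setup.sh treating the request as already satisfied instead of shrinking the pool, and verify_nr_hugepages below re-reads every counter to prove nothing shrank. A hypothetical sketch of such a grow-only guard (illustrative only; the names and structure are not the real scripts/setup.sh code):

    # Grow-only reservation: top a node's 2 MB hugepage pool up to $want,
    # but never shrink a larger existing allocation.
    node=0 want=512
    nr=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
    have=$(<"$nr")
    if (( have >= want )); then
        echo "INFO: Requested $want hugepages but $have already allocated on node$node"
    else
        echo "$want" > "$nr"    # needs root; the kernel grows the pool to $want
    fi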
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:32.425 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41436440 kB' 'MemAvailable: 43081724 kB' 'Buffers: 6816 kB' 'Cached: 11425056 kB' 'SwapCached: 248 kB' 'Active: 8897940 kB' 'Inactive: 3155000 kB' 'Active(anon): 7990260 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 624096 kB' 'Mapped: 164772 kB' 'Shmem: 9682968 kB' 'KReclaimable: 593180 kB' 'Slab: 1599272 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006092 kB' 'KernelStack: 21904 kB' 'PageTables: 8772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... same @31/@32 scan; every field before AnonHugePages hits continue ...]
00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
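The anon=0 above closes out the guard opened at hugepages.sh@95: the string always [madvise] never is the content of /sys/kernel/mm/transparent_hugepage/enabled, with the brackets marking the active mode, so THP is not switched fully off and the script samples AnonHugePages (0 kB here), presumably so transparent hugepage usage can be discounted from the pool arithmetic that follows. In outline, assuming the standard sysfs file and reusing the get_meminfo sketch from earlier:

    # Only bother accounting for THP when it is not switched off outright.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. 'always [madvise] never'
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)    # 0 kB in this run
    fi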
'PageTables: 8704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- 
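
The trace above is setup/common.sh's get_meminfo helper taking a snapshot of /proc/meminfo and scanning it field by field for the requested key. A minimal sketch of the same pattern, reconstructed from the xtrace output rather than copied from the SPDK source (function body and variable handling are inferred, so details may differ):

#!/usr/bin/env bash
# get_meminfo KEY [NODE]: print KEY's value from (per-node) meminfo, else 0.
shopt -s extglob   # the +([0-9]) pattern below needs extended globs

get_meminfo() {
    local get=$1 node=${2:-}
    local var val _ line
    local mem_f=/proc/meminfo mem
    # With a node id, prefer the per-NUMA-node view when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node lines look like "Node 0 MemTotal: ..."; strip that prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # quoted, so matched literally
        echo "$val"
        return 0
    done
    echo 0
}

get_meminfo HugePages_Surp   # prints 0 on the machine in this log
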
00:05:32.427 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [meminfo scan: MemTotal .. HugePages_Rsvd != HugePages_Surp, continue]
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17-29 -- # [get=HugePages_Rsvd, node= (empty), mem_f=/proc/meminfo, mapfile -t mem, mem=("${mem[@]#Node +([0-9]) }")]
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41436852 kB' 'MemAvailable: 43082136 kB' 'Buffers: 6816 kB' 'Cached: 11425076 kB' 'SwapCached: 248 kB' 'Active: 8897692 kB' 'Inactive: 3155000 kB' 'Active(anon): 7990012 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623812 kB' 'Mapped: 164668 kB' 'Shmem: 9682988 kB' 'KReclaimable: 593180 kB' 'Slab: 1599260 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006080 kB' 'KernelStack: 21888 kB' 'PageTables: 8704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
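
A note on the odd-looking patterns such as \H\u\g\e\P\a\g\e\s\_\S\u\r\p: when the right-hand side of == inside [[ ]] is quoted, bash's xtrace prints every character backslash-escaped to show that it is matched literally instead of as a glob. The escaping is purely a trace artifact; the script itself compares against a plain quoted variable, along these lines:

set -x
get=HugePages_Surp var=MemTotal
[[ $var == "$get" ]]   # xtrace renders this as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
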
00:05:32.692 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [meminfo scan: MemTotal .. HugePages_Free != HugePages_Rsvd, continue]
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:32.694 nr_hugepages=1024
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:32.694 resv_hugepages=0
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:32.694 surplus_hugepages=0
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:32.694 anon_hugepages=0
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17-29 -- # [get=HugePages_Total, node= (empty), mem_f=/proc/meminfo, mapfile -t mem, mem=("${mem[@]#Node +([0-9]) }")]
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41437344 kB' 'MemAvailable: 43082628 kB' 'Buffers: 6816 kB' 'Cached: 11425100 kB' 'SwapCached: 248 kB' 'Active: 8897760 kB' 'Inactive: 3155000 kB' 'Active(anon): 7990080 kB' 'Inactive(anon): 2313776 kB' 'Active(file): 907680 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699196 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623820 kB' 'Mapped: 164668 kB' 'Shmem: 9683012 kB' 'KReclaimable: 593180 kB' 'Slab: 1599260 kB' 'SReclaimable: 593180 kB' 'SUnreclaim: 1006080 kB' 'KernelStack: 21888 kB' 'PageTables: 8704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12239776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 118720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
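
The (( 1024 == nr_hugepages + surp + resv )) and (( 1024 == nr_hugepages )) checks above assert that the kernel still reports the full 1024-page pool this run requested, with zero surplus and zero reserved pages. The same invariant can be re-derived straight from /proc/meminfo; a rough standalone equivalent (1024 is this run's request, and the variable names here are illustrative, not the script's):

#!/usr/bin/env bash
# Check the hugepage pool invariant asserted by hugepages.sh@106-108.
requested=1024
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)

(( requested == total + surp + resv )) || { echo "pool mismatch" >&2; exit 1; }
echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp"
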
00:05:32.694 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [meminfo scan: MemTotal .. AnonHugePages != HugePages_Total, continue]
IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:32.696 
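
All of those continue records are one linear pass over /proc/meminfo. A minimal, self-contained sketch of the same scan, for readers following the trace (an illustration, not SPDK's setup/common.sh; the name get_meminfo_sketch is invented here):

#!/usr/bin/env bash
# Return the value column for one /proc/meminfo field, the way the trace
# above does it: split each "Field: value unit" line on ': ' and stop at
# the first field whose name matches.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1  # field not present
}
# Usage: get_meminfo_sketch HugePages_Total   -> prints 1024 on this box
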
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22492792 kB' 'MemUsed: 10141644 kB' 'SwapCached: 148 kB' 'Active: 5162824 kB' 'Inactive: 535724 kB' 'Active(anon): 4385060 kB' 'Inactive(anon): 520 kB' 'Active(file): 777764 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5418740 kB' 'Mapped: 85184 kB' 'AnonPages: 282936 kB' 'Shmem: 4105624 kB' 'KernelStack: 10152 kB' 'PageTables: 4500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 403352 kB' 'Slab: 888816 kB' 'SReclaimable: 403352 kB' 'SUnreclaim: 485464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:32.696 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: the node-0 scan then walks MemFree through Unaccepted with the same @32 test, @32 continue, @31 IFS=': ', @31 read cycle, none matching HugePages_Surp]
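
The same field scan is reused per NUMA node above: once a node is given, get_meminfo switches mem_f to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that is stripped before matching. A hedged sketch of that per-node variant (illustrative only, not the SPDK helper itself; extglob is required for the +([0-9]) pattern the trace uses):

#!/usr/bin/env bash
shopt -s extglob
# Per-node variant: node meminfo lines look like "Node 0 MemTotal: ... kB",
# so drop the "Node <n> " prefix, then run the same field scan.
get_node_meminfo_sketch() {
    local get=$1 node=$2 var val _ mem
    local mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node 0 " prefix
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}
# e.g. get_node_meminfo_sketch HugePages_Surp 0   -> 0 in the run above
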
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:32.697 node0=1024 expecting 1024
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:32.697 
00:05:32.697 real 0m7.135s
00:05:32.697 user 0m2.539s
00:05:32.697 sys 0m4.694s
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:32.697 19:18:52 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:32.697 ************************************
00:05:32.697 END TEST no_shrink_alloc
00:05:32.697 ************************************
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:32.697 19:18:52 setup.sh.hugepages -- 
setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:32.697 19:18:52 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:32.697 00:05:32.697 real 0m23.693s 00:05:32.697 user 0m8.101s 00:05:32.697 sys 0m14.409s 00:05:32.697 19:18:52 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:32.697 19:18:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:32.697 ************************************ 00:05:32.697 END TEST hugepages 00:05:32.697 ************************************ 00:05:32.697 19:18:52 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:32.697 19:18:52 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:32.697 19:18:52 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:32.697 19:18:52 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:32.697 ************************************ 00:05:32.697 START TEST driver 00:05:32.697 ************************************ 00:05:32.697 19:18:52 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:32.957 * Looking for test storage... 00:05:32.957 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:32.957 19:18:52 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:32.957 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.957 --rc genhtml_branch_coverage=1 00:05:32.957 --rc genhtml_function_coverage=1 00:05:32.957 --rc genhtml_legend=1 00:05:32.957 --rc geninfo_all_blocks=1 00:05:32.957 --rc geninfo_unexecuted_blocks=1 00:05:32.957 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.957 ' 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:32.957 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.957 --rc genhtml_branch_coverage=1 00:05:32.957 --rc genhtml_function_coverage=1 00:05:32.957 --rc genhtml_legend=1 00:05:32.957 --rc geninfo_all_blocks=1 00:05:32.957 --rc geninfo_unexecuted_blocks=1 00:05:32.957 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.957 ' 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:32.957 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.957 --rc genhtml_branch_coverage=1 00:05:32.957 --rc genhtml_function_coverage=1 00:05:32.957 --rc genhtml_legend=1 00:05:32.957 --rc geninfo_all_blocks=1 00:05:32.957 --rc geninfo_unexecuted_blocks=1 00:05:32.957 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.957 ' 00:05:32.957 19:18:52 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:32.957 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.957 --rc genhtml_branch_coverage=1 00:05:32.957 --rc genhtml_function_coverage=1 00:05:32.957 --rc genhtml_legend=1 00:05:32.957 --rc geninfo_all_blocks=1 00:05:32.957 --rc geninfo_unexecuted_blocks=1 00:05:32.957 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.957 ' 00:05:32.957 19:18:52 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:32.957 19:18:52 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:32.957 19:18:52 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:37.151 19:18:56 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:05:37.151 19:18:56 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:37.151 19:18:56 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:37.151 19:18:56 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:05:37.151 ************************************
00:05:37.151 START TEST guess_driver
00:05:37.151 ************************************
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_groups
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:05:37.151 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:05:37.151 Looking for driver=vfio-pci
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config
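
Condensed, the pick traced above reduces to: vfio-pci is chosen when IOMMU groups are populated and modprobe can resolve vfio_pci to real kernel modules. A sketch of that decision (an approximation of what test/setup/driver.sh is doing, not a copy of it; the uio_pci_generic fallback here is a simplification):

#!/usr/bin/env bash
shopt -s nullglob
# Prefer vfio-pci only when the platform can actually use it.
pick_driver_sketch() {
    local groups=(/sys/kernel/iommu_groups/*)
    # vfio-pci needs populated IOMMU groups and a resolvable vfio_pci module
    if (( ${#groups[@]} > 0 )) && modprobe --show-depends vfio_pci &> /dev/null; then
        echo vfio-pci
    else
        echo uio_pci_generic   # simplified fallback, not SPDK's full logic
    fi
}
echo "Looking for driver=$(pick_driver_sketch)"
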
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]]
00:05:37.151 19:18:57 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:05:40.512 19:19:00 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:05:40.512 19:19:00 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:05:40.512 19:19:00 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[xtrace elided: the same @58 marker test, @61 driver test, @57 read cycle repeats for each remaining line of the config output, every one reporting vfio-pci]
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 ))
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:05:42.480 19:19:01 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:05:47.759 
00:05:47.759 real 0m9.771s
00:05:47.759 user 0m2.496s
00:05:47.759 sys 0m4.958s
00:05:47.759 19:19:06 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:47.759 19:19:06 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x
00:05:47.759 ************************************
00:05:47.759 END TEST guess_driver
00:05:47.759 ************************************
00:05:47.759 
00:05:47.759 real 0m14.238s
00:05:47.759 user 0m3.699s
00:05:47.759 sys 0m7.385s
00:05:47.759 19:19:06 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:47.759 19:19:06 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:05:47.759 ************************************
00:05:47.759 END TEST driver
00:05:47.759 ************************************
00:05:47.759 19:19:06 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:05:47.759 19:19:06 setup.sh -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:47.759 19:19:06 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:47.759 19:19:06 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:47.759 ************************************ 00:05:47.759 START TEST devices 00:05:47.759 ************************************ 00:05:47.759 19:19:06 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:47.759 * Looking for test storage... 00:05:47.759 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1693 -- # lcov --version 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:47.759 19:19:07 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:47.759 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.759 --rc genhtml_branch_coverage=1 00:05:47.759 --rc genhtml_function_coverage=1 00:05:47.759 --rc genhtml_legend=1 00:05:47.759 --rc geninfo_all_blocks=1 00:05:47.759 --rc geninfo_unexecuted_blocks=1 00:05:47.759 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.759 ' 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:47.759 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.759 --rc genhtml_branch_coverage=1 00:05:47.759 --rc genhtml_function_coverage=1 00:05:47.759 --rc genhtml_legend=1 00:05:47.759 --rc geninfo_all_blocks=1 00:05:47.759 --rc geninfo_unexecuted_blocks=1 00:05:47.759 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.759 ' 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:47.759 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.759 --rc genhtml_branch_coverage=1 00:05:47.759 --rc genhtml_function_coverage=1 00:05:47.759 --rc genhtml_legend=1 00:05:47.759 --rc geninfo_all_blocks=1 00:05:47.759 --rc geninfo_unexecuted_blocks=1 00:05:47.759 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.759 ' 00:05:47.759 19:19:07 setup.sh.devices -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:47.759 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.759 --rc genhtml_branch_coverage=1 00:05:47.759 --rc genhtml_function_coverage=1 00:05:47.759 --rc genhtml_legend=1 00:05:47.759 --rc geninfo_all_blocks=1 00:05:47.759 --rc geninfo_unexecuted_blocks=1 00:05:47.759 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.759 ' 00:05:47.759 19:19:07 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:47.759 19:19:07 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:47.759 19:19:07 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:47.760 19:19:07 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:51.051 19:19:10 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:51.051 No valid GPT data, bailing 00:05:51.051 19:19:10 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:51.051 19:19:10 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:51.051 19:19:10 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:51.051 19:19:10 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:51.051 19:19:10 
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:51.051 19:19:10 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:51.311 ************************************ 00:05:51.311 START TEST nvme_mount 00:05:51.311 ************************************ 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:51.311 19:19:10 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:52.248 Creating new GPT entries in memory. 00:05:52.248 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:52.248 other utilities. 00:05:52.248 19:19:11 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:52.248 19:19:12 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:52.248 19:19:12 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:52.248 19:19:12 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:52.248 19:19:12 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:53.187 Creating new GPT entries in memory. 00:05:53.187 The operation has completed successfully. 
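The sgdisk bounds above (--new=1:2048:2099199) fall out of partition_drive working in 512-byte sectors: the test's 1 GiB default size is divided down to sectors, and the first partition starts at sector 2048 (1 MiB, the conventional alignment). A minimal sketch of that arithmetic, assuming 512-byte logical sectors; the echoed command is illustrative only, not part of the test:

    size_bytes=1073741824                      # 1 GiB, the test's default partition size
    sectors=$(( size_bytes / 512 ))            # 2097152 sectors, mirroring (( size /= 512 )) above
    part_start=2048                            # sector 2048 = 1 MiB, conventional first-partition alignment
    part_end=$(( part_start + sectors - 1 ))   # 2048 + 2097152 - 1 = 2099199
    echo "sgdisk /dev/nvme0n1 --new=1:${part_start}:${part_end}"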
00:05:53.187 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:53.187 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:53.187 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1715823 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:53.446 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.447 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:53.447 19:19:13 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:53.447 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:53.447 19:19:13 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:56.738 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:56.738 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:56.997 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:56.998 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:56.998 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:56.998 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:56.998 19:19:16 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:00.288 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:00.546 19:19:20 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.856 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:03.857 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:03.857 00:06:03.857 real 0m12.762s 00:06:03.857 user 0m3.708s 00:06:03.857 sys 0m6.977s 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:03.857 19:19:23 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:03.857 ************************************ 00:06:03.857 END TEST nvme_mount 00:06:03.857 ************************************ 00:06:04.116 19:19:23 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:04.116 19:19:23 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:04.116 19:19:23 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:04.116 19:19:23 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:04.116 ************************************ 00:06:04.116 START TEST dm_mount 00:06:04.116 ************************************ 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:04.116 19:19:23 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:05.056 Creating new GPT entries in memory. 00:06:05.056 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:05.056 other utilities. 00:06:05.056 19:19:24 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:05.056 19:19:24 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:05.056 19:19:24 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:05.056 19:19:24 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:05.056 19:19:24 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:05.995 Creating new GPT entries in memory. 00:06:05.995 The operation has completed successfully. 00:06:05.995 19:19:25 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:05.995 19:19:25 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:05.995 19:19:25 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:05.995 19:19:25 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:05.995 19:19:25 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:07.376 The operation has completed successfully. 
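dm_mount carves two partitions, and each new partition starts one sector past the previous end, which is how the trace arrives at --new=1:2048:2099199 followed by --new=2:2099200:4196351. A sketch of that chaining, reusing the ternary from setup/common.sh@58; the loop and echo are illustrative assumptions, not the test script itself:

    sectors=2097152                # 1 GiB expressed in 512-byte sectors
    part_start=0
    part_end=0
    for part in 1 2; do
        (( part_start = part_start == 0 ? 2048 : part_end + 1 ))   # chain off the previous partition's end
        (( part_end = part_start + sectors - 1 ))
        echo "sgdisk /dev/nvme0n1 --new=${part}:${part_start}:${part_end}"
    done
    # prints 1:2048:2099199 and 2:2099200:4196351, matching the sgdisk calls in the trace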
00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1720280 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:07.376 19:19:26 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:07.376 19:19:27 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:10.669 19:19:30 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:10.669 19:19:30 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:13.963 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:13.963 00:06:13.963 real 0m10.035s 00:06:13.963 user 0m2.423s 00:06:13.963 sys 0m4.664s 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:13.963 19:19:33 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:13.963 ************************************ 00:06:13.963 END TEST dm_mount 00:06:13.963 ************************************ 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:14.222 19:19:33 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:14.482 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:14.482 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:14.482 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:14.482 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:14.482 19:19:34 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:14.482 19:19:34 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:14.482 19:19:34 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:14.482 19:19:34 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:14.482 19:19:34 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:14.482 19:19:34 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:14.482 19:19:34 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:14.482 00:06:14.482 real 0m27.305s 00:06:14.482 user 0m7.675s 00:06:14.482 sys 0m14.518s 00:06:14.482 19:19:34 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:14.482 19:19:34 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:14.482 ************************************ 00:06:14.482 END TEST devices 00:06:14.482 ************************************ 00:06:14.482 00:06:14.482 real 1m30.140s 00:06:14.482 user 0m27.205s 00:06:14.482 sys 0m51.560s 00:06:14.482 19:19:34 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:14.482 19:19:34 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:14.482 ************************************ 00:06:14.482 END TEST setup.sh 00:06:14.482 ************************************ 00:06:14.482 19:19:34 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:17.768 Hugepages 00:06:17.768 node hugesize free / total 00:06:17.768 node0 1048576kB 0 / 0 00:06:17.768 node0 2048kB 1024 / 1024 00:06:17.768 node1 1048576kB 0 / 0 00:06:17.768 node1 2048kB 1024 / 1024 00:06:17.768 00:06:17.768 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:17.768 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:17.769 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:17.769 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:17.769 19:19:37 -- spdk/autotest.sh@117 -- # uname -s 00:06:17.769 19:19:37 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:17.769 19:19:37 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:17.769 19:19:37 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:21.064 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:21.064 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:21.064 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:22.444 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:22.444 19:19:42 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:23.824 19:19:43 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:23.824 19:19:43 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:23.824 19:19:43 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:23.824 19:19:43 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:23.824 19:19:43 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:23.824 19:19:43 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:23.824 19:19:43 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:23.824 19:19:43 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:23.824 19:19:43 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:23.824 19:19:43 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:23.824 19:19:43 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:23.824 19:19:43 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:27.116 Waiting for block devices as requested 00:06:27.116 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:27.116 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:27.116 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:27.116 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:27.116 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:27.116 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:27.375 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:27.375 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:27.375 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:27.375 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:27.635 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:27.635 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:27.635 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:27.895 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:27.895 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:27.895 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:28.155 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:28.414 19:19:48 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:06:28.414 19:19:48 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:06:28.414 19:19:48 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:28.414 19:19:48 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:06:28.414 19:19:48 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:06:28.414 19:19:48 -- common/autotest_common.sh@1531 -- # grep oacs 00:06:28.414 19:19:48 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:06:28.414 19:19:48 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:06:28.414 19:19:48 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:06:28.414 19:19:48 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:06:28.414 19:19:48 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:06:28.414 19:19:48 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:06:28.414 19:19:48 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:06:28.414 19:19:48 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:06:28.414 19:19:48 -- common/autotest_common.sh@1543 -- # continue 00:06:28.414 19:19:48 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:28.414 19:19:48 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:28.414 19:19:48 -- common/autotest_common.sh@10 -- # set +x 00:06:28.414 19:19:48 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:28.414 19:19:48 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:28.414 19:19:48 -- common/autotest_common.sh@10 -- # set +x 00:06:28.414 19:19:48 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:31.708 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:31.708 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:31.974 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:33.356 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:33.615 19:19:53 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:33.615 19:19:53 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:33.615 19:19:53 -- common/autotest_common.sh@10 -- # set +x 00:06:33.615 19:19:53 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:33.615 19:19:53 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:06:33.615 19:19:53 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:06:33.615 19:19:53 -- common/autotest_common.sh@1563 -- # bdfs=() 00:06:33.615 19:19:53 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:06:33.615 19:19:53 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:06:33.615 19:19:53 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:06:33.615 19:19:53 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:06:33.615 19:19:53 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:33.615 19:19:53 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:33.615 19:19:53 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:33.615 19:19:53 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:33.615 19:19:53 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:33.615 19:19:53 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:33.615 19:19:53 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:33.615 19:19:53 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:06:33.615 19:19:53 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:33.615 19:19:53 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:06:33.615 19:19:53 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:33.615 19:19:53 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:06:33.615 19:19:53 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:06:33.615 19:19:53 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:d8:00.0 00:06:33.615 19:19:53 -- common/autotest_common.sh@1579 -- # [[ -z 0000:d8:00.0 ]] 00:06:33.615 19:19:53 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=1730034 00:06:33.615 19:19:53 -- common/autotest_common.sh@1585 -- # waitforlisten 1730034 00:06:33.615 19:19:53 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:33.615 19:19:53 -- common/autotest_common.sh@835 -- # '[' -z 1730034 ']' 00:06:33.615 19:19:53 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.615 19:19:53 -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:33.615 19:19:53 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:33.615 19:19:53 -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:33.615 19:19:53 -- common/autotest_common.sh@10 -- # set +x 00:06:33.615 [2024-11-29 19:19:53.476518] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
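The opal_revert_cleanup trace above narrows the controller list to device id 0x0a54 by reading each BDF's sysfs device file, then launches spdk_tgt so the revert can be driven over JSON-RPC. A standalone sketch of that filter, assuming jq is installed and an SPDK checkout as the working directory (0x0a54 is the Intel NVMe device id this rig carries; substitute your own):

    want=0x0a54
    # gen_nvme.sh emits a bdev config; jq pulls out each controller's PCI address
    for bdf in $(./scripts/gen_nvme.sh | jq -r '.config[].params.traddr'); do
        dev=$(cat "/sys/bus/pci/devices/$bdf/device")
        [[ $dev == "$want" ]] && echo "$bdf"
    done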
00:06:33.615 [2024-11-29 19:19:53.476612] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1730034 ] 00:06:33.874 [2024-11-29 19:19:53.549225] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.874 [2024-11-29 19:19:53.572391] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.874 19:19:53 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:33.874 19:19:53 -- common/autotest_common.sh@868 -- # return 0 00:06:33.874 19:19:53 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:06:33.874 19:19:53 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:06:33.874 19:19:53 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:37.161 nvme0n1 00:06:37.161 19:19:56 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:37.161 [2024-11-29 19:19:56.959361] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:37.161 request: 00:06:37.161 { 00:06:37.161 "nvme_ctrlr_name": "nvme0", 00:06:37.161 "password": "test", 00:06:37.161 "method": "bdev_nvme_opal_revert", 00:06:37.161 "req_id": 1 00:06:37.161 } 00:06:37.161 Got JSON-RPC error response 00:06:37.161 response: 00:06:37.161 { 00:06:37.161 "code": -32602, 00:06:37.161 "message": "Invalid parameters" 00:06:37.161 } 00:06:37.161 19:19:56 -- common/autotest_common.sh@1591 -- # true 00:06:37.161 19:19:56 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:06:37.161 19:19:56 -- common/autotest_common.sh@1595 -- # killprocess 1730034 00:06:37.161 19:19:56 -- common/autotest_common.sh@954 -- # '[' -z 1730034 ']' 00:06:37.161 19:19:56 -- common/autotest_common.sh@958 -- # kill -0 1730034 00:06:37.161 19:19:56 -- common/autotest_common.sh@959 -- # uname 00:06:37.161 19:19:56 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:37.161 19:19:56 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1730034 00:06:37.161 19:19:57 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:37.161 19:19:57 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:37.161 19:19:57 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1730034' 00:06:37.161 killing process with pid 1730034 00:06:37.161 19:19:57 -- common/autotest_common.sh@973 -- # kill 1730034 00:06:37.161 19:19:57 -- common/autotest_common.sh@978 -- # wait 1730034 00:06:39.693 19:19:59 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:39.693 19:19:59 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:39.693 19:19:59 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:39.693 19:19:59 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:39.693 19:19:59 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:39.693 19:19:59 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:39.693 19:19:59 -- common/autotest_common.sh@10 -- # set +x 00:06:39.693 19:19:59 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:39.693 19:19:59 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:39.693 19:19:59 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:39.693 19:19:59 -- common/autotest_common.sh@1111 
-- # xtrace_disable 00:06:39.693 19:19:59 -- common/autotest_common.sh@10 -- # set +x 00:06:39.693 ************************************ 00:06:39.693 START TEST env 00:06:39.693 ************************************ 00:06:39.693 19:19:59 env -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:39.693 * Looking for test storage... 00:06:39.693 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:39.693 19:19:59 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:39.693 19:19:59 env -- common/autotest_common.sh@1693 -- # lcov --version 00:06:39.693 19:19:59 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:39.693 19:19:59 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:39.693 19:19:59 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:39.693 19:19:59 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:39.693 19:19:59 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:39.693 19:19:59 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:39.693 19:19:59 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:39.693 19:19:59 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:39.693 19:19:59 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:39.693 19:19:59 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:39.693 19:19:59 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:39.693 19:19:59 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:39.693 19:19:59 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:39.693 19:19:59 env -- scripts/common.sh@344 -- # case "$op" in 00:06:39.693 19:19:59 env -- scripts/common.sh@345 -- # : 1 00:06:39.693 19:19:59 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:39.693 19:19:59 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:39.693 19:19:59 env -- scripts/common.sh@365 -- # decimal 1 00:06:39.693 19:19:59 env -- scripts/common.sh@353 -- # local d=1 00:06:39.693 19:19:59 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:39.693 19:19:59 env -- scripts/common.sh@355 -- # echo 1 00:06:39.693 19:19:59 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:39.693 19:19:59 env -- scripts/common.sh@366 -- # decimal 2 00:06:39.693 19:19:59 env -- scripts/common.sh@353 -- # local d=2 00:06:39.693 19:19:59 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:39.694 19:19:59 env -- scripts/common.sh@355 -- # echo 2 00:06:39.694 19:19:59 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:39.694 19:19:59 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:39.694 19:19:59 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:39.694 19:19:59 env -- scripts/common.sh@368 -- # return 0 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:39.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.694 --rc genhtml_branch_coverage=1 00:06:39.694 --rc genhtml_function_coverage=1 00:06:39.694 --rc genhtml_legend=1 00:06:39.694 --rc geninfo_all_blocks=1 00:06:39.694 --rc geninfo_unexecuted_blocks=1 00:06:39.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.694 ' 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:39.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.694 --rc genhtml_branch_coverage=1 00:06:39.694 --rc genhtml_function_coverage=1 00:06:39.694 --rc genhtml_legend=1 00:06:39.694 --rc geninfo_all_blocks=1 00:06:39.694 --rc geninfo_unexecuted_blocks=1 00:06:39.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.694 ' 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:39.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.694 --rc genhtml_branch_coverage=1 00:06:39.694 --rc genhtml_function_coverage=1 00:06:39.694 --rc genhtml_legend=1 00:06:39.694 --rc geninfo_all_blocks=1 00:06:39.694 --rc geninfo_unexecuted_blocks=1 00:06:39.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.694 ' 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:39.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.694 --rc genhtml_branch_coverage=1 00:06:39.694 --rc genhtml_function_coverage=1 00:06:39.694 --rc genhtml_legend=1 00:06:39.694 --rc geninfo_all_blocks=1 00:06:39.694 --rc geninfo_unexecuted_blocks=1 00:06:39.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.694 ' 00:06:39.694 19:19:59 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:39.694 19:19:59 env -- common/autotest_common.sh@10 -- # set +x 00:06:39.694 ************************************ 00:06:39.694 START TEST env_memory 00:06:39.694 ************************************ 00:06:39.694 19:19:59 env.env_memory -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:39.694 00:06:39.694 00:06:39.694 CUnit - A unit testing framework for C - Version 2.1-3 00:06:39.694 http://cunit.sourceforge.net/ 00:06:39.694 00:06:39.694 00:06:39.694 Suite: memory 00:06:39.694 Test: alloc and free memory map ...[2024-11-29 19:19:59.400997] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:39.694 passed 00:06:39.694 Test: mem map translation ...[2024-11-29 19:19:59.413716] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:39.694 [2024-11-29 19:19:59.413737] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:39.694 [2024-11-29 19:19:59.413767] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:39.694 [2024-11-29 19:19:59.413777] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:39.694 passed 00:06:39.694 Test: mem map registration ...[2024-11-29 19:19:59.433795] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:39.694 [2024-11-29 19:19:59.433813] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:39.694 passed 00:06:39.694 Test: mem map adjacent registrations ...passed 00:06:39.694 00:06:39.694 Run Summary: Type Total Ran Passed Failed Inactive 00:06:39.694 suites 1 1 n/a 0 0 00:06:39.694 tests 4 4 4 0 0 00:06:39.694 asserts 152 152 152 0 n/a 00:06:39.694 00:06:39.694 Elapsed time = 0.082 seconds 00:06:39.694 00:06:39.694 real 0m0.095s 00:06:39.694 user 0m0.084s 00:06:39.694 sys 0m0.011s 00:06:39.694 19:19:59 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:39.694 19:19:59 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:39.694 ************************************ 00:06:39.694 END TEST env_memory 00:06:39.694 ************************************ 00:06:39.694 19:19:59 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:39.694 19:19:59 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:39.694 19:19:59 env -- common/autotest_common.sh@10 -- # set +x 00:06:39.694 ************************************ 00:06:39.694 START TEST env_vtophys 00:06:39.694 ************************************ 00:06:39.694 19:19:59 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:39.694 EAL: lib.eal log level changed from notice to debug 00:06:39.694 EAL: Detected lcore 0 as core 0 on socket 0 00:06:39.694 EAL: Detected lcore 1 as core 1 on socket 0 00:06:39.694 EAL: Detected lcore 2 as core 2 on socket 0 00:06:39.694 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:39.694 EAL: Detected lcore 4 as core 4 on socket 0 00:06:39.694 EAL: Detected lcore 5 as core 5 on socket 0 00:06:39.694 EAL: Detected lcore 6 as core 6 on socket 0 00:06:39.694 EAL: Detected lcore 7 as core 8 on socket 0 00:06:39.694 EAL: Detected lcore 8 as core 9 on socket 0 00:06:39.694 EAL: Detected lcore 9 as core 10 on socket 0 00:06:39.694 EAL: Detected lcore 10 as core 11 on socket 0 00:06:39.694 EAL: Detected lcore 11 as core 12 on socket 0 00:06:39.694 EAL: Detected lcore 12 as core 13 on socket 0 00:06:39.694 EAL: Detected lcore 13 as core 14 on socket 0 00:06:39.694 EAL: Detected lcore 14 as core 16 on socket 0 00:06:39.694 EAL: Detected lcore 15 as core 17 on socket 0 00:06:39.694 EAL: Detected lcore 16 as core 18 on socket 0 00:06:39.694 EAL: Detected lcore 17 as core 19 on socket 0 00:06:39.694 EAL: Detected lcore 18 as core 20 on socket 0 00:06:39.694 EAL: Detected lcore 19 as core 21 on socket 0 00:06:39.694 EAL: Detected lcore 20 as core 22 on socket 0 00:06:39.694 EAL: Detected lcore 21 as core 24 on socket 0 00:06:39.694 EAL: Detected lcore 22 as core 25 on socket 0 00:06:39.694 EAL: Detected lcore 23 as core 26 on socket 0 00:06:39.694 EAL: Detected lcore 24 as core 27 on socket 0 00:06:39.694 EAL: Detected lcore 25 as core 28 on socket 0 00:06:39.694 EAL: Detected lcore 26 as core 29 on socket 0 00:06:39.694 EAL: Detected lcore 27 as core 30 on socket 0 00:06:39.694 EAL: Detected lcore 28 as core 0 on socket 1 00:06:39.694 EAL: Detected lcore 29 as core 1 on socket 1 00:06:39.694 EAL: Detected lcore 30 as core 2 on socket 1 00:06:39.694 EAL: Detected lcore 31 as core 3 on socket 1 00:06:39.694 EAL: Detected lcore 32 as core 4 on socket 1 00:06:39.694 EAL: Detected lcore 33 as core 5 on socket 1 00:06:39.694 EAL: Detected lcore 34 as core 6 on socket 1 00:06:39.694 EAL: Detected lcore 35 as core 8 on socket 1 00:06:39.694 EAL: Detected lcore 36 as core 9 on socket 1 00:06:39.694 EAL: Detected lcore 37 as core 10 on socket 1 00:06:39.694 EAL: Detected lcore 38 as core 11 on socket 1 00:06:39.694 EAL: Detected lcore 39 as core 12 on socket 1 00:06:39.694 EAL: Detected lcore 40 as core 13 on socket 1 00:06:39.694 EAL: Detected lcore 41 as core 14 on socket 1 00:06:39.694 EAL: Detected lcore 42 as core 16 on socket 1 00:06:39.694 EAL: Detected lcore 43 as core 17 on socket 1 00:06:39.694 EAL: Detected lcore 44 as core 18 on socket 1 00:06:39.694 EAL: Detected lcore 45 as core 19 on socket 1 00:06:39.694 EAL: Detected lcore 46 as core 20 on socket 1 00:06:39.694 EAL: Detected lcore 47 as core 21 on socket 1 00:06:39.694 EAL: Detected lcore 48 as core 22 on socket 1 00:06:39.694 EAL: Detected lcore 49 as core 24 on socket 1 00:06:39.694 EAL: Detected lcore 50 as core 25 on socket 1 00:06:39.694 EAL: Detected lcore 51 as core 26 on socket 1 00:06:39.694 EAL: Detected lcore 52 as core 27 on socket 1 00:06:39.694 EAL: Detected lcore 53 as core 28 on socket 1 00:06:39.694 EAL: Detected lcore 54 as core 29 on socket 1 00:06:39.694 EAL: Detected lcore 55 as core 30 on socket 1 00:06:39.694 EAL: Detected lcore 56 as core 0 on socket 0 00:06:39.694 EAL: Detected lcore 57 as core 1 on socket 0 00:06:39.694 EAL: Detected lcore 58 as core 2 on socket 0 00:06:39.694 EAL: Detected lcore 59 as core 3 on socket 0 00:06:39.694 EAL: Detected lcore 60 as core 4 on socket 0 00:06:39.694 EAL: Detected lcore 61 as core 5 on socket 0 00:06:39.694 EAL: Detected lcore 62 as core 6 on socket 0 00:06:39.694 EAL: Detected lcore 63 as core 8 on socket 0 00:06:39.694 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:39.694 EAL: Detected lcore 65 as core 10 on socket 0 00:06:39.694 EAL: Detected lcore 66 as core 11 on socket 0 00:06:39.694 EAL: Detected lcore 67 as core 12 on socket 0 00:06:39.694 EAL: Detected lcore 68 as core 13 on socket 0 00:06:39.694 EAL: Detected lcore 69 as core 14 on socket 0 00:06:39.694 EAL: Detected lcore 70 as core 16 on socket 0 00:06:39.694 EAL: Detected lcore 71 as core 17 on socket 0 00:06:39.694 EAL: Detected lcore 72 as core 18 on socket 0 00:06:39.694 EAL: Detected lcore 73 as core 19 on socket 0 00:06:39.694 EAL: Detected lcore 74 as core 20 on socket 0 00:06:39.694 EAL: Detected lcore 75 as core 21 on socket 0 00:06:39.694 EAL: Detected lcore 76 as core 22 on socket 0 00:06:39.694 EAL: Detected lcore 77 as core 24 on socket 0 00:06:39.694 EAL: Detected lcore 78 as core 25 on socket 0 00:06:39.694 EAL: Detected lcore 79 as core 26 on socket 0 00:06:39.694 EAL: Detected lcore 80 as core 27 on socket 0 00:06:39.694 EAL: Detected lcore 81 as core 28 on socket 0 00:06:39.694 EAL: Detected lcore 82 as core 29 on socket 0 00:06:39.695 EAL: Detected lcore 83 as core 30 on socket 0 00:06:39.695 EAL: Detected lcore 84 as core 0 on socket 1 00:06:39.695 EAL: Detected lcore 85 as core 1 on socket 1 00:06:39.695 EAL: Detected lcore 86 as core 2 on socket 1 00:06:39.695 EAL: Detected lcore 87 as core 3 on socket 1 00:06:39.695 EAL: Detected lcore 88 as core 4 on socket 1 00:06:39.695 EAL: Detected lcore 89 as core 5 on socket 1 00:06:39.695 EAL: Detected lcore 90 as core 6 on socket 1 00:06:39.695 EAL: Detected lcore 91 as core 8 on socket 1 00:06:39.695 EAL: Detected lcore 92 as core 9 on socket 1 00:06:39.695 EAL: Detected lcore 93 as core 10 on socket 1 00:06:39.695 EAL: Detected lcore 94 as core 11 on socket 1 00:06:39.695 EAL: Detected lcore 95 as core 12 on socket 1 00:06:39.695 EAL: Detected lcore 96 as core 13 on socket 1 00:06:39.695 EAL: Detected lcore 97 as core 14 on socket 1 00:06:39.695 EAL: Detected lcore 98 as core 16 on socket 1 00:06:39.695 EAL: Detected lcore 99 as core 17 on socket 1 00:06:39.695 EAL: Detected lcore 100 as core 18 on socket 1 00:06:39.695 EAL: Detected lcore 101 as core 19 on socket 1 00:06:39.695 EAL: Detected lcore 102 as core 20 on socket 1 00:06:39.695 EAL: Detected lcore 103 as core 21 on socket 1 00:06:39.695 EAL: Detected lcore 104 as core 22 on socket 1 00:06:39.695 EAL: Detected lcore 105 as core 24 on socket 1 00:06:39.695 EAL: Detected lcore 106 as core 25 on socket 1 00:06:39.695 EAL: Detected lcore 107 as core 26 on socket 1 00:06:39.695 EAL: Detected lcore 108 as core 27 on socket 1 00:06:39.695 EAL: Detected lcore 109 as core 28 on socket 1 00:06:39.695 EAL: Detected lcore 110 as core 29 on socket 1 00:06:39.695 EAL: Detected lcore 111 as core 30 on socket 1 00:06:39.695 EAL: Maximum logical cores by configuration: 128 00:06:39.695 EAL: Detected CPU lcores: 112 00:06:39.695 EAL: Detected NUMA nodes: 2 00:06:39.695 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:06:39.695 EAL: Checking presence of .so 'librte_eal.so.24' 00:06:39.695 EAL: Checking presence of .so 'librte_eal.so' 00:06:39.695 EAL: Detected static linkage of DPDK 00:06:39.695 EAL: No shared files mode enabled, IPC will be disabled 00:06:39.954 EAL: Bus pci wants IOVA as 'DC' 00:06:39.954 EAL: Buses did not request a specific IOVA mode. 00:06:39.954 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:39.954 EAL: Selected IOVA mode 'VA' 00:06:39.954 EAL: Probing VFIO support... 
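EAL builds the lcore table above from Linux sysfs at startup. A rough way to reproduce a similar map outside DPDK, assuming a sysfs layout like this machine's (socket shown here as the physical package id, which coincides with the NUMA node on this box):

    # print each logical CPU with its core id and package/socket id
    for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
        printf 'lcore %s -> core %s on socket %s\n' "${cpu##*/cpu}" \
            "$(cat "$cpu/topology/core_id")" \
            "$(cat "$cpu/topology/physical_package_id")"
    done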
00:06:39.954 EAL: IOMMU type 1 (Type 1) is supported 00:06:39.954 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:39.954 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:39.954 EAL: VFIO support initialized 00:06:39.954 EAL: Ask a virtual area of 0x2e000 bytes 00:06:39.954 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:39.954 EAL: Setting up physically contiguous memory... 00:06:39.954 EAL: Setting maximum number of open files to 524288 00:06:39.954 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:39.954 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:39.954 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:39.954 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:39.954 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.954 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:39.954 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.954 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.954 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:39.954 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:39.954 EAL: Hugepages will be freed exactly as allocated. 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: TSC frequency is ~2500000 KHz 00:06:39.954 EAL: Main lcore 0 is ready (tid=7f8167fb9a00;cpuset=[0]) 00:06:39.954 EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 0 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 2MB 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Mem event callback 'spdk:(nil)' registered 00:06:39.954 00:06:39.954 00:06:39.954 CUnit - A unit testing framework for C - Version 2.1-3 00:06:39.954 http://cunit.sourceforge.net/ 00:06:39.954 00:06:39.954 00:06:39.954 Suite: components_suite 00:06:39.954 Test: vtophys_malloc_test ...passed 00:06:39.954 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 4 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 4MB 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was shrunk by 4MB 00:06:39.954 EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 4 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 6MB 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was shrunk by 6MB 00:06:39.954 EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 4 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 10MB 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was shrunk by 10MB 00:06:39.954 EAL: Trying to obtain current memory policy. 
00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 4 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 18MB 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was shrunk by 18MB 00:06:39.954 EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 4 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 34MB 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was shrunk by 34MB 00:06:39.954 EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.954 EAL: Restoring previous memory policy: 4 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was expanded by 66MB 00:06:39.954 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.954 EAL: request: mp_malloc_sync 00:06:39.954 EAL: No shared files mode enabled, IPC is disabled 00:06:39.954 EAL: Heap on socket 0 was shrunk by 66MB 00:06:39.954 EAL: Trying to obtain current memory policy. 00:06:39.954 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.955 EAL: Restoring previous memory policy: 4 00:06:39.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.955 EAL: request: mp_malloc_sync 00:06:39.955 EAL: No shared files mode enabled, IPC is disabled 00:06:39.955 EAL: Heap on socket 0 was expanded by 130MB 00:06:39.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.955 EAL: request: mp_malloc_sync 00:06:39.955 EAL: No shared files mode enabled, IPC is disabled 00:06:39.955 EAL: Heap on socket 0 was shrunk by 130MB 00:06:39.955 EAL: Trying to obtain current memory policy. 00:06:39.955 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.955 EAL: Restoring previous memory policy: 4 00:06:39.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.955 EAL: request: mp_malloc_sync 00:06:39.955 EAL: No shared files mode enabled, IPC is disabled 00:06:39.955 EAL: Heap on socket 0 was expanded by 258MB 00:06:39.955 EAL: Calling mem event callback 'spdk:(nil)' 00:06:40.213 EAL: request: mp_malloc_sync 00:06:40.213 EAL: No shared files mode enabled, IPC is disabled 00:06:40.213 EAL: Heap on socket 0 was shrunk by 258MB 00:06:40.213 EAL: Trying to obtain current memory policy. 
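Each expand/shrink pair above is vtophys_spdk_malloc_test allocating a successively larger buffer and freeing it again: the allocation grows the DPDK heap with fresh 2 MB hugepages, the registered mem event callback 'spdk:(nil)' fires, and the free returns the pages. One way to watch the same churn from a second shell while the test runs, assuming 2 MB hugepages as on this box:

    # system-wide hugepage accounting
    watch -n 0.5 'grep -E "HugePages_(Total|Free)" /proc/meminfo'
    # or per NUMA node, matching the socket 0/1 split in the EAL output
    cat /sys/devices/system/node/node*/hugepages/hugepages-2048kB/free_hugepages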
00:06:40.213 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:40.213 EAL: Restoring previous memory policy: 4 00:06:40.213 EAL: Calling mem event callback 'spdk:(nil)' 00:06:40.213 EAL: request: mp_malloc_sync 00:06:40.213 EAL: No shared files mode enabled, IPC is disabled 00:06:40.213 EAL: Heap on socket 0 was expanded by 514MB 00:06:40.213 EAL: Calling mem event callback 'spdk:(nil)' 00:06:40.472 EAL: request: mp_malloc_sync 00:06:40.472 EAL: No shared files mode enabled, IPC is disabled 00:06:40.472 EAL: Heap on socket 0 was shrunk by 514MB 00:06:40.472 EAL: Trying to obtain current memory policy. 00:06:40.472 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:40.472 EAL: Restoring previous memory policy: 4 00:06:40.472 EAL: Calling mem event callback 'spdk:(nil)' 00:06:40.472 EAL: request: mp_malloc_sync 00:06:40.472 EAL: No shared files mode enabled, IPC is disabled 00:06:40.472 EAL: Heap on socket 0 was expanded by 1026MB 00:06:40.731 EAL: Calling mem event callback 'spdk:(nil)' 00:06:41.015 EAL: request: mp_malloc_sync 00:06:41.015 EAL: No shared files mode enabled, IPC is disabled 00:06:41.015 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:41.015 passed 00:06:41.015 00:06:41.015 Run Summary: Type Total Ran Passed Failed Inactive 00:06:41.015 suites 1 1 n/a 0 0 00:06:41.015 tests 2 2 2 0 0 00:06:41.015 asserts 497 497 497 0 n/a 00:06:41.015 00:06:41.015 Elapsed time = 0.974 seconds 00:06:41.015 EAL: Calling mem event callback 'spdk:(nil)' 00:06:41.015 EAL: request: mp_malloc_sync 00:06:41.015 EAL: No shared files mode enabled, IPC is disabled 00:06:41.015 EAL: Heap on socket 0 was shrunk by 2MB 00:06:41.015 EAL: No shared files mode enabled, IPC is disabled 00:06:41.015 EAL: No shared files mode enabled, IPC is disabled 00:06:41.015 EAL: No shared files mode enabled, IPC is disabled 00:06:41.015 00:06:41.015 real 0m1.101s 00:06:41.015 user 0m0.632s 00:06:41.015 sys 0m0.440s 00:06:41.015 19:20:00 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:41.015 19:20:00 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:41.015 ************************************ 00:06:41.015 END TEST env_vtophys 00:06:41.015 ************************************ 00:06:41.015 19:20:00 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:41.015 19:20:00 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:41.015 19:20:00 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:41.015 19:20:00 env -- common/autotest_common.sh@10 -- # set +x 00:06:41.015 ************************************ 00:06:41.015 START TEST env_pci 00:06:41.015 ************************************ 00:06:41.015 19:20:00 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:41.015 00:06:41.015 00:06:41.015 CUnit - A unit testing framework for C - Version 2.1-3 00:06:41.015 http://cunit.sourceforge.net/ 00:06:41.015 00:06:41.015 00:06:41.015 Suite: pci 00:06:41.015 Test: pci_hook ...[2024-11-29 19:20:00.752688] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1731332 has claimed it 00:06:41.015 EAL: Cannot find device (10000:00:01.0) 00:06:41.015 EAL: Failed to attach device on primary process 00:06:41.015 passed 00:06:41.015 00:06:41.015 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:41.015 suites 1 1 n/a 0 0 00:06:41.015 tests 1 1 1 0 0 00:06:41.016 asserts 25 25 25 0 n/a 00:06:41.016 00:06:41.016 Elapsed time = 0.036 seconds 00:06:41.016 00:06:41.016 real 0m0.055s 00:06:41.016 user 0m0.017s 00:06:41.016 sys 0m0.038s 00:06:41.016 19:20:00 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:41.016 19:20:00 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:41.016 ************************************ 00:06:41.016 END TEST env_pci 00:06:41.016 ************************************ 00:06:41.016 19:20:00 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:41.016 19:20:00 env -- env/env.sh@15 -- # uname 00:06:41.016 19:20:00 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:41.016 19:20:00 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:41.016 19:20:00 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:41.016 19:20:00 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:41.016 19:20:00 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:41.016 19:20:00 env -- common/autotest_common.sh@10 -- # set +x 00:06:41.016 ************************************ 00:06:41.016 START TEST env_dpdk_post_init 00:06:41.016 ************************************ 00:06:41.016 19:20:00 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:41.431 EAL: Detected CPU lcores: 112 00:06:41.431 EAL: Detected NUMA nodes: 2 00:06:41.431 EAL: Detected static linkage of DPDK 00:06:41.431 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:41.431 EAL: Selected IOVA mode 'VA' 00:06:41.431 EAL: VFIO support initialized 00:06:41.431 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:41.431 EAL: Using IOMMU type 1 (Type 1) 00:06:42.003 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:46.198 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:46.198 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:46.198 Starting DPDK initialization... 00:06:46.198 Starting SPDK post initialization... 00:06:46.198 SPDK NVMe probe 00:06:46.198 Attaching to 0000:d8:00.0 00:06:46.198 Attached to 0000:d8:00.0 00:06:46.198 Cleaning up... 
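env_dpdk_post_init above exercises the full attach path: EAL probes 0000:d8:00.0 through vfio-pci, the spdk_nvme driver attaches, and cleanup unmaps the PCI resource at 0x202001000000. The binding behind this is visible in sysfs; a small sketch, assuming Linux and the BDF from this run (substitute your controller's address):

    bdf=0000:d8:00.0
    # which kernel driver currently owns the device (vfio-pci while tests run)
    readlink "/sys/bus/pci/devices/$bdf/driver"
    # the IOMMU group vfio uses for DMA isolation
    readlink "/sys/bus/pci/devices/$bdf/iommu_group"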
00:06:46.198 00:06:46.198 real 0m4.722s 00:06:46.198 user 0m3.545s 00:06:46.198 sys 0m0.420s 00:06:46.198 19:20:05 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.198 19:20:05 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:46.198 ************************************ 00:06:46.198 END TEST env_dpdk_post_init 00:06:46.198 ************************************ 00:06:46.198 19:20:05 env -- env/env.sh@26 -- # uname 00:06:46.198 19:20:05 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:46.198 19:20:05 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:46.198 19:20:05 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.198 19:20:05 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.199 19:20:05 env -- common/autotest_common.sh@10 -- # set +x 00:06:46.199 ************************************ 00:06:46.199 START TEST env_mem_callbacks 00:06:46.199 ************************************ 00:06:46.199 19:20:05 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:46.199 EAL: Detected CPU lcores: 112 00:06:46.199 EAL: Detected NUMA nodes: 2 00:06:46.199 EAL: Detected static linkage of DPDK 00:06:46.199 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:46.199 EAL: Selected IOVA mode 'VA' 00:06:46.199 EAL: VFIO support initialized 00:06:46.199 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:46.199 00:06:46.199 00:06:46.199 CUnit - A unit testing framework for C - Version 2.1-3 00:06:46.199 http://cunit.sourceforge.net/ 00:06:46.199 00:06:46.199 00:06:46.199 Suite: memory 00:06:46.199 Test: test ... 
00:06:46.199 register 0x200000200000 2097152 00:06:46.199 malloc 3145728 00:06:46.199 register 0x200000400000 4194304 00:06:46.199 buf 0x200000500000 len 3145728 PASSED 00:06:46.199 malloc 64 00:06:46.199 buf 0x2000004fff40 len 64 PASSED 00:06:46.199 malloc 4194304 00:06:46.199 register 0x200000800000 6291456 00:06:46.199 buf 0x200000a00000 len 4194304 PASSED 00:06:46.199 free 0x200000500000 3145728 00:06:46.199 free 0x2000004fff40 64 00:06:46.199 unregister 0x200000400000 4194304 PASSED 00:06:46.199 free 0x200000a00000 4194304 00:06:46.199 unregister 0x200000800000 6291456 PASSED 00:06:46.199 malloc 8388608 00:06:46.199 register 0x200000400000 10485760 00:06:46.199 buf 0x200000600000 len 8388608 PASSED 00:06:46.199 free 0x200000600000 8388608 00:06:46.199 unregister 0x200000400000 10485760 PASSED 00:06:46.199 passed 00:06:46.199 00:06:46.199 Run Summary: Type Total Ran Passed Failed Inactive 00:06:46.199 suites 1 1 n/a 0 0 00:06:46.199 tests 1 1 1 0 0 00:06:46.199 asserts 15 15 15 0 n/a 00:06:46.199 00:06:46.199 Elapsed time = 0.005 seconds 00:06:46.199 00:06:46.199 real 0m0.063s 00:06:46.199 user 0m0.023s 00:06:46.199 sys 0m0.040s 00:06:46.199 19:20:05 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.199 19:20:05 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:46.199 ************************************ 00:06:46.199 END TEST env_mem_callbacks 00:06:46.199 ************************************ 00:06:46.199 00:06:46.199 real 0m6.654s 00:06:46.199 user 0m4.541s 00:06:46.199 sys 0m1.374s 00:06:46.199 19:20:05 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.199 19:20:05 env -- common/autotest_common.sh@10 -- # set +x 00:06:46.199 ************************************ 00:06:46.199 END TEST env 00:06:46.199 ************************************ 00:06:46.199 19:20:05 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:46.199 19:20:05 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.199 19:20:05 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.199 19:20:05 -- common/autotest_common.sh@10 -- # set +x 00:06:46.199 ************************************ 00:06:46.199 START TEST rpc 00:06:46.199 ************************************ 00:06:46.199 19:20:05 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:46.199 * Looking for test storage... 
00:06:46.199 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:46.199 19:20:05 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:46.199 19:20:05 rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:46.199 19:20:05 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:46.199 19:20:06 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:46.199 19:20:06 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:46.199 19:20:06 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:46.199 19:20:06 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:46.199 19:20:06 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:46.199 19:20:06 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:46.199 19:20:06 rpc -- scripts/common.sh@345 -- # : 1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:46.199 19:20:06 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:46.199 19:20:06 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@353 -- # local d=1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:46.199 19:20:06 rpc -- scripts/common.sh@355 -- # echo 1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:46.199 19:20:06 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@353 -- # local d=2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:46.199 19:20:06 rpc -- scripts/common.sh@355 -- # echo 2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:46.199 19:20:06 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:46.199 19:20:06 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:46.199 19:20:06 rpc -- scripts/common.sh@368 -- # return 0 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:46.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.199 --rc genhtml_branch_coverage=1 00:06:46.199 --rc genhtml_function_coverage=1 00:06:46.199 --rc genhtml_legend=1 00:06:46.199 --rc geninfo_all_blocks=1 00:06:46.199 --rc geninfo_unexecuted_blocks=1 00:06:46.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.199 ' 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:46.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.199 --rc genhtml_branch_coverage=1 00:06:46.199 --rc genhtml_function_coverage=1 00:06:46.199 --rc genhtml_legend=1 00:06:46.199 --rc geninfo_all_blocks=1 00:06:46.199 --rc geninfo_unexecuted_blocks=1 00:06:46.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.199 ' 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@1707 -- # 
export 'LCOV=lcov 00:06:46.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.199 --rc genhtml_branch_coverage=1 00:06:46.199 --rc genhtml_function_coverage=1 00:06:46.199 --rc genhtml_legend=1 00:06:46.199 --rc geninfo_all_blocks=1 00:06:46.199 --rc geninfo_unexecuted_blocks=1 00:06:46.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.199 ' 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:46.199 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.199 --rc genhtml_branch_coverage=1 00:06:46.199 --rc genhtml_function_coverage=1 00:06:46.199 --rc genhtml_legend=1 00:06:46.199 --rc geninfo_all_blocks=1 00:06:46.199 --rc geninfo_unexecuted_blocks=1 00:06:46.199 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.199 ' 00:06:46.199 19:20:06 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1732529 00:06:46.199 19:20:06 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:46.199 19:20:06 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:46.199 19:20:06 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1732529 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@835 -- # '[' -z 1732529 ']' 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:46.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:46.199 19:20:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.199 [2024-11-29 19:20:06.087872] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:06:46.199 [2024-11-29 19:20:06.087944] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1732529 ] 00:06:46.458 [2024-11-29 19:20:06.156770] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.458 [2024-11-29 19:20:06.177733] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:46.458 [2024-11-29 19:20:06.177769] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1732529' to capture a snapshot of events at runtime. 00:06:46.458 [2024-11-29 19:20:06.177778] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:46.458 [2024-11-29 19:20:06.177786] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:46.458 [2024-11-29 19:20:06.177793] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1732529 for offline analysis/debug. 
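The startup notices above come from spdk_tgt being launched with '-e bdev', which enables the bdev tracepoint group and backs it with a shm ring named after the pid. The log spells out both ways to reach the events; as a sketch, with paths assumed relative to this workspace's SPDK checkout:

    # live snapshot of the enabled bdev tracepoints
    build/bin/spdk_trace -s spdk_tgt -p 1732529
    # or preserve the ring for offline decoding after the target exits
    cp /dev/shm/spdk_tgt_trace.pid1732529 /tmp/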
00:06:46.458 [2024-11-29 19:20:06.178379] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.716 19:20:06 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:46.716 19:20:06 rpc -- common/autotest_common.sh@868 -- # return 0 00:06:46.716 19:20:06 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:46.716 19:20:06 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:46.716 19:20:06 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:46.716 19:20:06 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:46.716 19:20:06 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.716 19:20:06 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.716 19:20:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.716 ************************************ 00:06:46.716 START TEST rpc_integrity 00:06:46.716 ************************************ 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:46.716 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:46.716 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:46.717 { 00:06:46.717 "name": "Malloc0", 00:06:46.717 "aliases": [ 00:06:46.717 "bd582744-d457-49ea-ab34-dfe4ed49c4d2" 00:06:46.717 ], 00:06:46.717 "product_name": "Malloc disk", 00:06:46.717 "block_size": 512, 00:06:46.717 "num_blocks": 16384, 00:06:46.717 "uuid": "bd582744-d457-49ea-ab34-dfe4ed49c4d2", 00:06:46.717 "assigned_rate_limits": { 00:06:46.717 "rw_ios_per_sec": 0, 00:06:46.717 "rw_mbytes_per_sec": 0, 00:06:46.717 "r_mbytes_per_sec": 0, 00:06:46.717 "w_mbytes_per_sec": 
0 00:06:46.717 }, 00:06:46.717 "claimed": false, 00:06:46.717 "zoned": false, 00:06:46.717 "supported_io_types": { 00:06:46.717 "read": true, 00:06:46.717 "write": true, 00:06:46.717 "unmap": true, 00:06:46.717 "flush": true, 00:06:46.717 "reset": true, 00:06:46.717 "nvme_admin": false, 00:06:46.717 "nvme_io": false, 00:06:46.717 "nvme_io_md": false, 00:06:46.717 "write_zeroes": true, 00:06:46.717 "zcopy": true, 00:06:46.717 "get_zone_info": false, 00:06:46.717 "zone_management": false, 00:06:46.717 "zone_append": false, 00:06:46.717 "compare": false, 00:06:46.717 "compare_and_write": false, 00:06:46.717 "abort": true, 00:06:46.717 "seek_hole": false, 00:06:46.717 "seek_data": false, 00:06:46.717 "copy": true, 00:06:46.717 "nvme_iov_md": false 00:06:46.717 }, 00:06:46.717 "memory_domains": [ 00:06:46.717 { 00:06:46.717 "dma_device_id": "system", 00:06:46.717 "dma_device_type": 1 00:06:46.717 }, 00:06:46.717 { 00:06:46.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.717 "dma_device_type": 2 00:06:46.717 } 00:06:46.717 ], 00:06:46.717 "driver_specific": {} 00:06:46.717 } 00:06:46.717 ]' 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.717 [2024-11-29 19:20:06.549218] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:46.717 [2024-11-29 19:20:06.549248] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:46.717 [2024-11-29 19:20:06.549267] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x50a60a0 00:06:46.717 [2024-11-29 19:20:06.549277] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:46.717 [2024-11-29 19:20:06.550191] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:46.717 [2024-11-29 19:20:06.550214] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:46.717 Passthru0 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:46.717 { 00:06:46.717 "name": "Malloc0", 00:06:46.717 "aliases": [ 00:06:46.717 "bd582744-d457-49ea-ab34-dfe4ed49c4d2" 00:06:46.717 ], 00:06:46.717 "product_name": "Malloc disk", 00:06:46.717 "block_size": 512, 00:06:46.717 "num_blocks": 16384, 00:06:46.717 "uuid": "bd582744-d457-49ea-ab34-dfe4ed49c4d2", 00:06:46.717 "assigned_rate_limits": { 00:06:46.717 "rw_ios_per_sec": 0, 00:06:46.717 "rw_mbytes_per_sec": 0, 00:06:46.717 "r_mbytes_per_sec": 0, 00:06:46.717 "w_mbytes_per_sec": 0 00:06:46.717 }, 00:06:46.717 "claimed": true, 00:06:46.717 "claim_type": "exclusive_write", 00:06:46.717 "zoned": false, 00:06:46.717 "supported_io_types": { 00:06:46.717 "read": true, 00:06:46.717 "write": true, 00:06:46.717 "unmap": true, 
00:06:46.717 "flush": true, 00:06:46.717 "reset": true, 00:06:46.717 "nvme_admin": false, 00:06:46.717 "nvme_io": false, 00:06:46.717 "nvme_io_md": false, 00:06:46.717 "write_zeroes": true, 00:06:46.717 "zcopy": true, 00:06:46.717 "get_zone_info": false, 00:06:46.717 "zone_management": false, 00:06:46.717 "zone_append": false, 00:06:46.717 "compare": false, 00:06:46.717 "compare_and_write": false, 00:06:46.717 "abort": true, 00:06:46.717 "seek_hole": false, 00:06:46.717 "seek_data": false, 00:06:46.717 "copy": true, 00:06:46.717 "nvme_iov_md": false 00:06:46.717 }, 00:06:46.717 "memory_domains": [ 00:06:46.717 { 00:06:46.717 "dma_device_id": "system", 00:06:46.717 "dma_device_type": 1 00:06:46.717 }, 00:06:46.717 { 00:06:46.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.717 "dma_device_type": 2 00:06:46.717 } 00:06:46.717 ], 00:06:46.717 "driver_specific": {} 00:06:46.717 }, 00:06:46.717 { 00:06:46.717 "name": "Passthru0", 00:06:46.717 "aliases": [ 00:06:46.717 "e3195cee-f7a4-5579-99da-c60b66a4d13c" 00:06:46.717 ], 00:06:46.717 "product_name": "passthru", 00:06:46.717 "block_size": 512, 00:06:46.717 "num_blocks": 16384, 00:06:46.717 "uuid": "e3195cee-f7a4-5579-99da-c60b66a4d13c", 00:06:46.717 "assigned_rate_limits": { 00:06:46.717 "rw_ios_per_sec": 0, 00:06:46.717 "rw_mbytes_per_sec": 0, 00:06:46.717 "r_mbytes_per_sec": 0, 00:06:46.717 "w_mbytes_per_sec": 0 00:06:46.717 }, 00:06:46.717 "claimed": false, 00:06:46.717 "zoned": false, 00:06:46.717 "supported_io_types": { 00:06:46.717 "read": true, 00:06:46.717 "write": true, 00:06:46.717 "unmap": true, 00:06:46.717 "flush": true, 00:06:46.717 "reset": true, 00:06:46.717 "nvme_admin": false, 00:06:46.717 "nvme_io": false, 00:06:46.717 "nvme_io_md": false, 00:06:46.717 "write_zeroes": true, 00:06:46.717 "zcopy": true, 00:06:46.717 "get_zone_info": false, 00:06:46.717 "zone_management": false, 00:06:46.717 "zone_append": false, 00:06:46.717 "compare": false, 00:06:46.717 "compare_and_write": false, 00:06:46.717 "abort": true, 00:06:46.717 "seek_hole": false, 00:06:46.717 "seek_data": false, 00:06:46.717 "copy": true, 00:06:46.717 "nvme_iov_md": false 00:06:46.717 }, 00:06:46.717 "memory_domains": [ 00:06:46.717 { 00:06:46.717 "dma_device_id": "system", 00:06:46.717 "dma_device_type": 1 00:06:46.717 }, 00:06:46.717 { 00:06:46.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.717 "dma_device_type": 2 00:06:46.717 } 00:06:46.717 ], 00:06:46.717 "driver_specific": { 00:06:46.717 "passthru": { 00:06:46.717 "name": "Passthru0", 00:06:46.717 "base_bdev_name": "Malloc0" 00:06:46.717 } 00:06:46.717 } 00:06:46.717 } 00:06:46.717 ]' 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.717 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.717 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.976 19:20:06 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:46.976 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.976 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.976 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:46.976 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:46.976 19:20:06 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:46.976 00:06:46.976 real 0m0.271s 00:06:46.976 user 0m0.163s 00:06:46.976 sys 0m0.045s 00:06:46.976 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.976 19:20:06 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 ************************************ 00:06:46.976 END TEST rpc_integrity 00:06:46.976 ************************************ 00:06:46.976 19:20:06 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:46.976 19:20:06 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.976 19:20:06 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.976 19:20:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 ************************************ 00:06:46.976 START TEST rpc_plugins 00:06:46.976 ************************************ 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:46.976 { 00:06:46.976 "name": "Malloc1", 00:06:46.976 "aliases": [ 00:06:46.976 "9c392a66-def3-4472-95b1-3977ed0ee884" 00:06:46.976 ], 00:06:46.976 "product_name": "Malloc disk", 00:06:46.976 "block_size": 4096, 00:06:46.976 "num_blocks": 256, 00:06:46.976 "uuid": "9c392a66-def3-4472-95b1-3977ed0ee884", 00:06:46.976 "assigned_rate_limits": { 00:06:46.976 "rw_ios_per_sec": 0, 00:06:46.976 "rw_mbytes_per_sec": 0, 00:06:46.976 "r_mbytes_per_sec": 0, 00:06:46.976 "w_mbytes_per_sec": 0 00:06:46.976 }, 00:06:46.976 "claimed": false, 00:06:46.976 "zoned": false, 00:06:46.976 "supported_io_types": { 00:06:46.976 "read": true, 00:06:46.976 "write": true, 00:06:46.976 "unmap": true, 00:06:46.976 "flush": true, 00:06:46.976 "reset": true, 00:06:46.976 "nvme_admin": false, 00:06:46.976 "nvme_io": false, 00:06:46.976 "nvme_io_md": false, 00:06:46.976 "write_zeroes": true, 00:06:46.976 "zcopy": true, 00:06:46.976 "get_zone_info": false, 00:06:46.976 "zone_management": false, 00:06:46.976 "zone_append": false, 00:06:46.976 "compare": false, 00:06:46.976 "compare_and_write": false, 00:06:46.976 "abort": true, 00:06:46.976 "seek_hole": false, 00:06:46.976 "seek_data": false, 00:06:46.976 "copy": true, 00:06:46.976 
"nvme_iov_md": false 00:06:46.976 }, 00:06:46.976 "memory_domains": [ 00:06:46.976 { 00:06:46.976 "dma_device_id": "system", 00:06:46.976 "dma_device_type": 1 00:06:46.976 }, 00:06:46.976 { 00:06:46.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.976 "dma_device_type": 2 00:06:46.976 } 00:06:46.976 ], 00:06:46.976 "driver_specific": {} 00:06:46.976 } 00:06:46.976 ]' 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.976 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:46.976 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:47.235 19:20:06 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:47.235 00:06:47.235 real 0m0.136s 00:06:47.235 user 0m0.085s 00:06:47.235 sys 0m0.015s 00:06:47.235 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.235 19:20:06 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:47.235 ************************************ 00:06:47.235 END TEST rpc_plugins 00:06:47.235 ************************************ 00:06:47.235 19:20:06 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:47.235 19:20:06 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.235 19:20:06 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.235 19:20:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.235 ************************************ 00:06:47.235 START TEST rpc_trace_cmd_test 00:06:47.235 ************************************ 00:06:47.235 19:20:06 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:06:47.235 19:20:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:47.235 19:20:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:47.235 19:20:06 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.235 19:20:06 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:47.235 19:20:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.235 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:47.235 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1732529", 00:06:47.235 "tpoint_group_mask": "0x8", 00:06:47.235 "iscsi_conn": { 00:06:47.235 "mask": "0x2", 00:06:47.235 "tpoint_mask": "0x0" 00:06:47.235 }, 00:06:47.235 "scsi": { 00:06:47.235 "mask": "0x4", 00:06:47.235 "tpoint_mask": "0x0" 00:06:47.235 }, 00:06:47.235 "bdev": { 00:06:47.235 "mask": "0x8", 00:06:47.235 "tpoint_mask": "0xffffffffffffffff" 00:06:47.235 }, 00:06:47.235 "nvmf_rdma": { 00:06:47.235 "mask": "0x10", 00:06:47.235 "tpoint_mask": "0x0" 00:06:47.235 }, 00:06:47.236 "nvmf_tcp": { 00:06:47.236 "mask": "0x20", 
00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "ftl": { 00:06:47.236 "mask": "0x40", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "blobfs": { 00:06:47.236 "mask": "0x80", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "dsa": { 00:06:47.236 "mask": "0x200", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "thread": { 00:06:47.236 "mask": "0x400", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "nvme_pcie": { 00:06:47.236 "mask": "0x800", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "iaa": { 00:06:47.236 "mask": "0x1000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "nvme_tcp": { 00:06:47.236 "mask": "0x2000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "bdev_nvme": { 00:06:47.236 "mask": "0x4000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "sock": { 00:06:47.236 "mask": "0x8000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "blob": { 00:06:47.236 "mask": "0x10000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "bdev_raid": { 00:06:47.236 "mask": "0x20000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 }, 00:06:47.236 "scheduler": { 00:06:47.236 "mask": "0x40000", 00:06:47.236 "tpoint_mask": "0x0" 00:06:47.236 } 00:06:47.236 }' 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:47.236 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:47.495 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:47.495 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:47.495 19:20:07 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:47.495 00:06:47.495 real 0m0.205s 00:06:47.495 user 0m0.162s 00:06:47.495 sys 0m0.034s 00:06:47.495 19:20:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.495 19:20:07 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:47.495 ************************************ 00:06:47.495 END TEST rpc_trace_cmd_test 00:06:47.495 ************************************ 00:06:47.495 19:20:07 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:47.495 19:20:07 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:47.495 19:20:07 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:47.495 19:20:07 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.495 19:20:07 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.495 19:20:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.495 ************************************ 00:06:47.495 START TEST rpc_daemon_integrity 00:06:47.495 ************************************ 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.495 19:20:07 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:47.495 { 00:06:47.495 "name": "Malloc2", 00:06:47.495 "aliases": [ 00:06:47.495 "633efce0-eb69-4d77-8aa8-1d9e5b31252a" 00:06:47.495 ], 00:06:47.495 "product_name": "Malloc disk", 00:06:47.495 "block_size": 512, 00:06:47.495 "num_blocks": 16384, 00:06:47.495 "uuid": "633efce0-eb69-4d77-8aa8-1d9e5b31252a", 00:06:47.495 "assigned_rate_limits": { 00:06:47.495 "rw_ios_per_sec": 0, 00:06:47.495 "rw_mbytes_per_sec": 0, 00:06:47.495 "r_mbytes_per_sec": 0, 00:06:47.495 "w_mbytes_per_sec": 0 00:06:47.495 }, 00:06:47.495 "claimed": false, 00:06:47.495 "zoned": false, 00:06:47.495 "supported_io_types": { 00:06:47.495 "read": true, 00:06:47.495 "write": true, 00:06:47.495 "unmap": true, 00:06:47.495 "flush": true, 00:06:47.495 "reset": true, 00:06:47.495 "nvme_admin": false, 00:06:47.495 "nvme_io": false, 00:06:47.495 "nvme_io_md": false, 00:06:47.495 "write_zeroes": true, 00:06:47.495 "zcopy": true, 00:06:47.495 "get_zone_info": false, 00:06:47.495 "zone_management": false, 00:06:47.495 "zone_append": false, 00:06:47.495 "compare": false, 00:06:47.495 "compare_and_write": false, 00:06:47.495 "abort": true, 00:06:47.495 "seek_hole": false, 00:06:47.495 "seek_data": false, 00:06:47.495 "copy": true, 00:06:47.495 "nvme_iov_md": false 00:06:47.495 }, 00:06:47.495 "memory_domains": [ 00:06:47.495 { 00:06:47.495 "dma_device_id": "system", 00:06:47.495 "dma_device_type": 1 00:06:47.495 }, 00:06:47.495 { 00:06:47.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:47.495 "dma_device_type": 2 00:06:47.495 } 00:06:47.495 ], 00:06:47.495 "driver_specific": {} 00:06:47.495 } 00:06:47.495 ]' 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.495 [2024-11-29 19:20:07.387376] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:47.495 
[2024-11-29 19:20:07.387406] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:47.495 [2024-11-29 19:20:07.387428] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x51c7820 00:06:47.495 [2024-11-29 19:20:07.387439] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:47.495 [2024-11-29 19:20:07.388188] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:47.495 [2024-11-29 19:20:07.388212] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:47.495 Passthru0 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.495 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:47.755 { 00:06:47.755 "name": "Malloc2", 00:06:47.755 "aliases": [ 00:06:47.755 "633efce0-eb69-4d77-8aa8-1d9e5b31252a" 00:06:47.755 ], 00:06:47.755 "product_name": "Malloc disk", 00:06:47.755 "block_size": 512, 00:06:47.755 "num_blocks": 16384, 00:06:47.755 "uuid": "633efce0-eb69-4d77-8aa8-1d9e5b31252a", 00:06:47.755 "assigned_rate_limits": { 00:06:47.755 "rw_ios_per_sec": 0, 00:06:47.755 "rw_mbytes_per_sec": 0, 00:06:47.755 "r_mbytes_per_sec": 0, 00:06:47.755 "w_mbytes_per_sec": 0 00:06:47.755 }, 00:06:47.755 "claimed": true, 00:06:47.755 "claim_type": "exclusive_write", 00:06:47.755 "zoned": false, 00:06:47.755 "supported_io_types": { 00:06:47.755 "read": true, 00:06:47.755 "write": true, 00:06:47.755 "unmap": true, 00:06:47.755 "flush": true, 00:06:47.755 "reset": true, 00:06:47.755 "nvme_admin": false, 00:06:47.755 "nvme_io": false, 00:06:47.755 "nvme_io_md": false, 00:06:47.755 "write_zeroes": true, 00:06:47.755 "zcopy": true, 00:06:47.755 "get_zone_info": false, 00:06:47.755 "zone_management": false, 00:06:47.755 "zone_append": false, 00:06:47.755 "compare": false, 00:06:47.755 "compare_and_write": false, 00:06:47.755 "abort": true, 00:06:47.755 "seek_hole": false, 00:06:47.755 "seek_data": false, 00:06:47.755 "copy": true, 00:06:47.755 "nvme_iov_md": false 00:06:47.755 }, 00:06:47.755 "memory_domains": [ 00:06:47.755 { 00:06:47.755 "dma_device_id": "system", 00:06:47.755 "dma_device_type": 1 00:06:47.755 }, 00:06:47.755 { 00:06:47.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:47.755 "dma_device_type": 2 00:06:47.755 } 00:06:47.755 ], 00:06:47.755 "driver_specific": {} 00:06:47.755 }, 00:06:47.755 { 00:06:47.755 "name": "Passthru0", 00:06:47.755 "aliases": [ 00:06:47.755 "1be13a7b-c83e-5233-9103-9cb98fbd1438" 00:06:47.755 ], 00:06:47.755 "product_name": "passthru", 00:06:47.755 "block_size": 512, 00:06:47.755 "num_blocks": 16384, 00:06:47.755 "uuid": "1be13a7b-c83e-5233-9103-9cb98fbd1438", 00:06:47.755 "assigned_rate_limits": { 00:06:47.755 "rw_ios_per_sec": 0, 00:06:47.755 "rw_mbytes_per_sec": 0, 00:06:47.755 "r_mbytes_per_sec": 0, 00:06:47.755 "w_mbytes_per_sec": 0 00:06:47.755 }, 00:06:47.755 "claimed": false, 00:06:47.755 "zoned": false, 00:06:47.755 "supported_io_types": { 00:06:47.755 "read": true, 00:06:47.755 "write": true, 00:06:47.755 "unmap": true, 00:06:47.755 "flush": true, 00:06:47.755 "reset": true, 
00:06:47.755 "nvme_admin": false, 00:06:47.755 "nvme_io": false, 00:06:47.755 "nvme_io_md": false, 00:06:47.755 "write_zeroes": true, 00:06:47.755 "zcopy": true, 00:06:47.755 "get_zone_info": false, 00:06:47.755 "zone_management": false, 00:06:47.755 "zone_append": false, 00:06:47.755 "compare": false, 00:06:47.755 "compare_and_write": false, 00:06:47.755 "abort": true, 00:06:47.755 "seek_hole": false, 00:06:47.755 "seek_data": false, 00:06:47.755 "copy": true, 00:06:47.755 "nvme_iov_md": false 00:06:47.755 }, 00:06:47.755 "memory_domains": [ 00:06:47.755 { 00:06:47.755 "dma_device_id": "system", 00:06:47.755 "dma_device_type": 1 00:06:47.755 }, 00:06:47.755 { 00:06:47.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:47.755 "dma_device_type": 2 00:06:47.755 } 00:06:47.755 ], 00:06:47.755 "driver_specific": { 00:06:47.755 "passthru": { 00:06:47.755 "name": "Passthru0", 00:06:47.755 "base_bdev_name": "Malloc2" 00:06:47.755 } 00:06:47.755 } 00:06:47.755 } 00:06:47.755 ]' 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:47.755 00:06:47.755 real 0m0.266s 00:06:47.755 user 0m0.161s 00:06:47.755 sys 0m0.047s 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.755 19:20:07 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.755 ************************************ 00:06:47.755 END TEST rpc_daemon_integrity 00:06:47.755 ************************************ 00:06:47.755 19:20:07 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:47.755 19:20:07 rpc -- rpc/rpc.sh@84 -- # killprocess 1732529 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@954 -- # '[' -z 1732529 ']' 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@958 -- # kill -0 1732529 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@959 -- # uname 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1732529 
00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1732529' 00:06:47.756 killing process with pid 1732529 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@973 -- # kill 1732529 00:06:47.756 19:20:07 rpc -- common/autotest_common.sh@978 -- # wait 1732529 00:06:48.015 00:06:48.015 real 0m2.029s 00:06:48.015 user 0m2.555s 00:06:48.015 sys 0m0.776s 00:06:48.015 19:20:07 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:48.015 19:20:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.015 ************************************ 00:06:48.015 END TEST rpc 00:06:48.015 ************************************ 00:06:48.274 19:20:07 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:48.274 19:20:07 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:48.274 19:20:07 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:48.274 19:20:07 -- common/autotest_common.sh@10 -- # set +x 00:06:48.274 ************************************ 00:06:48.274 START TEST skip_rpc 00:06:48.274 ************************************ 00:06:48.274 19:20:08 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:48.274 * Looking for test storage... 00:06:48.274 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:48.274 19:20:08 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:48.274 19:20:08 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:48.274 19:20:08 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:48.533 19:20:08 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:48.533 19:20:08 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:48.533 19:20:08 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:48.533 19:20:08 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:48.533 19:20:08 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:48.533 19:20:08 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:48.534 19:20:08 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:48.534 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.534 --rc genhtml_branch_coverage=1 00:06:48.534 --rc genhtml_function_coverage=1 00:06:48.534 --rc genhtml_legend=1 00:06:48.534 --rc geninfo_all_blocks=1 00:06:48.534 --rc geninfo_unexecuted_blocks=1 00:06:48.534 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.534 ' 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:48.534 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.534 --rc genhtml_branch_coverage=1 00:06:48.534 --rc genhtml_function_coverage=1 00:06:48.534 --rc genhtml_legend=1 00:06:48.534 --rc geninfo_all_blocks=1 00:06:48.534 --rc geninfo_unexecuted_blocks=1 00:06:48.534 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.534 ' 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:48.534 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.534 --rc genhtml_branch_coverage=1 00:06:48.534 --rc genhtml_function_coverage=1 00:06:48.534 --rc genhtml_legend=1 00:06:48.534 --rc geninfo_all_blocks=1 00:06:48.534 --rc geninfo_unexecuted_blocks=1 00:06:48.534 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.534 ' 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:48.534 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.534 --rc genhtml_branch_coverage=1 00:06:48.534 --rc genhtml_function_coverage=1 00:06:48.534 --rc genhtml_legend=1 00:06:48.534 --rc geninfo_all_blocks=1 00:06:48.534 --rc geninfo_unexecuted_blocks=1 00:06:48.534 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.534 ' 00:06:48.534 19:20:08 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:48.534 19:20:08 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:48.534 19:20:08 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:48.534 19:20:08 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:48.534 19:20:08 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.534 ************************************ 00:06:48.534 START TEST skip_rpc 00:06:48.534 ************************************ 00:06:48.534 19:20:08 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:06:48.534 19:20:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:48.534 19:20:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1732985 00:06:48.534 19:20:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:48.534 19:20:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:48.534 [2024-11-29 19:20:08.253162] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:06:48.534 [2024-11-29 19:20:08.253212] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1732985 ] 00:06:48.534 [2024-11-29 19:20:08.320648] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.534 [2024-11-29 19:20:08.342638] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1732985 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 1732985 ']' 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 1732985 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1732985 
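The skip_rpc case above needs only one assertion: with spdk_tgt launched via --no-rpc-server, any RPC attempt must fail, and the NOT wrapper inverts rpc_cmd's exit status to turn that failure into a pass (the trace shows the expected path: [[ 1 == 0 ]] on the response check, then es=1). The shape of the check, sketched as plain shell under the assumption of the default socket path rather than the literal rpc.sh body:

    build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    spdk_pid=$!
    sleep 5                           # no RPC socket will ever appear, so the test just sleeps
    if scripts/rpc.py spdk_get_version; then
        echo "unexpected: RPC succeeded despite --no-rpc-server" >&2
        kill -9 $spdk_pid; exit 1
    fi
    kill -9 $spdk_pid                 # the target itself is healthy; it simply has no RPC listener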
00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1732985' 00:06:53.807 killing process with pid 1732985 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 1732985 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 1732985 00:06:53.807 00:06:53.807 real 0m5.369s 00:06:53.807 user 0m5.139s 00:06:53.807 sys 0m0.274s 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.807 19:20:13 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.807 ************************************ 00:06:53.807 END TEST skip_rpc 00:06:53.807 ************************************ 00:06:53.807 19:20:13 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:53.807 19:20:13 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.807 19:20:13 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.807 19:20:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.807 ************************************ 00:06:53.807 START TEST skip_rpc_with_json 00:06:53.807 ************************************ 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1734018 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1734018 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 1734018 ']' 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.807 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:54.067 [2024-11-29 19:20:13.716008] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
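The skip_rpc_with_json run that follows exercises save_config round-tripping: it first confirms no TCP transport exists (the nvmf_get_transports "No such device" error below is expected), creates one, dumps the whole target configuration to config.json, then restarts the target from that file and greps its log for the transport-init notice. The essential steps, sketched with rpc.py and illustrative relative paths:

    scripts/rpc.py nvmf_create_transport -t tcp
    scripts/rpc.py save_config > test/rpc/config.json
    kill -9 $spdk_pid                 # stop the hand-configured target
    build/bin/spdk_tgt --no-rpc-server -m 0x1 --json test/rpc/config.json &> test/rpc/log.txt &
    sleep 5
    grep -q 'TCP Transport Init' test/rpc/log.txt   # transport recreated purely from the JSON config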
00:06:54.067 [2024-11-29 19:20:13.716083] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1734018 ] 00:06:54.067 [2024-11-29 19:20:13.788958] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.067 [2024-11-29 19:20:13.811846] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.326 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:54.326 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:06:54.326 19:20:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:54.326 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.326 19:20:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:54.326 [2024-11-29 19:20:14.003499] nvmf_rpc.c:2706:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:54.326 request: 00:06:54.326 { 00:06:54.326 "trtype": "tcp", 00:06:54.326 "method": "nvmf_get_transports", 00:06:54.326 "req_id": 1 00:06:54.326 } 00:06:54.326 Got JSON-RPC error response 00:06:54.326 response: 00:06:54.326 { 00:06:54.326 "code": -19, 00:06:54.326 "message": "No such device" 00:06:54.326 } 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:54.326 [2024-11-29 19:20:14.015592] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.326 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:54.326 { 00:06:54.326 "subsystems": [ 00:06:54.326 { 00:06:54.326 "subsystem": "scheduler", 00:06:54.326 "config": [ 00:06:54.326 { 00:06:54.326 "method": "framework_set_scheduler", 00:06:54.326 "params": { 00:06:54.326 "name": "static" 00:06:54.326 } 00:06:54.326 } 00:06:54.326 ] 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "vmd", 00:06:54.326 "config": [] 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "sock", 00:06:54.326 "config": [ 00:06:54.326 { 00:06:54.326 "method": "sock_set_default_impl", 00:06:54.326 "params": { 00:06:54.326 "impl_name": "posix" 00:06:54.326 } 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "method": "sock_impl_set_options", 00:06:54.326 "params": { 00:06:54.326 "impl_name": "ssl", 00:06:54.326 "recv_buf_size": 4096, 00:06:54.326 "send_buf_size": 4096, 00:06:54.326 "enable_recv_pipe": true, 00:06:54.326 "enable_quickack": false, 00:06:54.326 
"enable_placement_id": 0, 00:06:54.326 "enable_zerocopy_send_server": true, 00:06:54.326 "enable_zerocopy_send_client": false, 00:06:54.326 "zerocopy_threshold": 0, 00:06:54.326 "tls_version": 0, 00:06:54.326 "enable_ktls": false 00:06:54.326 } 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "method": "sock_impl_set_options", 00:06:54.326 "params": { 00:06:54.326 "impl_name": "posix", 00:06:54.326 "recv_buf_size": 2097152, 00:06:54.326 "send_buf_size": 2097152, 00:06:54.326 "enable_recv_pipe": true, 00:06:54.326 "enable_quickack": false, 00:06:54.326 "enable_placement_id": 0, 00:06:54.326 "enable_zerocopy_send_server": true, 00:06:54.326 "enable_zerocopy_send_client": false, 00:06:54.326 "zerocopy_threshold": 0, 00:06:54.326 "tls_version": 0, 00:06:54.326 "enable_ktls": false 00:06:54.326 } 00:06:54.326 } 00:06:54.326 ] 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "iobuf", 00:06:54.326 "config": [ 00:06:54.326 { 00:06:54.326 "method": "iobuf_set_options", 00:06:54.326 "params": { 00:06:54.326 "small_pool_count": 8192, 00:06:54.326 "large_pool_count": 1024, 00:06:54.326 "small_bufsize": 8192, 00:06:54.326 "large_bufsize": 135168, 00:06:54.326 "enable_numa": false 00:06:54.326 } 00:06:54.326 } 00:06:54.326 ] 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "keyring", 00:06:54.326 "config": [] 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "vfio_user_target", 00:06:54.326 "config": null 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "fsdev", 00:06:54.326 "config": [ 00:06:54.326 { 00:06:54.326 "method": "fsdev_set_opts", 00:06:54.326 "params": { 00:06:54.326 "fsdev_io_pool_size": 65535, 00:06:54.326 "fsdev_io_cache_size": 256 00:06:54.326 } 00:06:54.326 } 00:06:54.326 ] 00:06:54.326 }, 00:06:54.326 { 00:06:54.326 "subsystem": "accel", 00:06:54.326 "config": [ 00:06:54.326 { 00:06:54.326 "method": "accel_set_options", 00:06:54.326 "params": { 00:06:54.326 "small_cache_size": 128, 00:06:54.326 "large_cache_size": 16, 00:06:54.326 "task_count": 2048, 00:06:54.326 "sequence_count": 2048, 00:06:54.326 "buf_count": 2048 00:06:54.326 } 00:06:54.326 } 00:06:54.326 ] 00:06:54.326 }, 00:06:54.326 { 00:06:54.327 "subsystem": "bdev", 00:06:54.327 "config": [ 00:06:54.327 { 00:06:54.327 "method": "bdev_set_options", 00:06:54.327 "params": { 00:06:54.327 "bdev_io_pool_size": 65535, 00:06:54.327 "bdev_io_cache_size": 256, 00:06:54.327 "bdev_auto_examine": true, 00:06:54.327 "iobuf_small_cache_size": 128, 00:06:54.327 "iobuf_large_cache_size": 16 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "bdev_raid_set_options", 00:06:54.327 "params": { 00:06:54.327 "process_window_size_kb": 1024, 00:06:54.327 "process_max_bandwidth_mb_sec": 0 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "bdev_nvme_set_options", 00:06:54.327 "params": { 00:06:54.327 "action_on_timeout": "none", 00:06:54.327 "timeout_us": 0, 00:06:54.327 "timeout_admin_us": 0, 00:06:54.327 "keep_alive_timeout_ms": 10000, 00:06:54.327 "arbitration_burst": 0, 00:06:54.327 "low_priority_weight": 0, 00:06:54.327 "medium_priority_weight": 0, 00:06:54.327 "high_priority_weight": 0, 00:06:54.327 "nvme_adminq_poll_period_us": 10000, 00:06:54.327 "nvme_ioq_poll_period_us": 0, 00:06:54.327 "io_queue_requests": 0, 00:06:54.327 "delay_cmd_submit": true, 00:06:54.327 "transport_retry_count": 4, 00:06:54.327 "bdev_retry_count": 3, 00:06:54.327 "transport_ack_timeout": 0, 00:06:54.327 "ctrlr_loss_timeout_sec": 0, 00:06:54.327 "reconnect_delay_sec": 0, 00:06:54.327 
"fast_io_fail_timeout_sec": 0, 00:06:54.327 "disable_auto_failback": false, 00:06:54.327 "generate_uuids": false, 00:06:54.327 "transport_tos": 0, 00:06:54.327 "nvme_error_stat": false, 00:06:54.327 "rdma_srq_size": 0, 00:06:54.327 "io_path_stat": false, 00:06:54.327 "allow_accel_sequence": false, 00:06:54.327 "rdma_max_cq_size": 0, 00:06:54.327 "rdma_cm_event_timeout_ms": 0, 00:06:54.327 "dhchap_digests": [ 00:06:54.327 "sha256", 00:06:54.327 "sha384", 00:06:54.327 "sha512" 00:06:54.327 ], 00:06:54.327 "dhchap_dhgroups": [ 00:06:54.327 "null", 00:06:54.327 "ffdhe2048", 00:06:54.327 "ffdhe3072", 00:06:54.327 "ffdhe4096", 00:06:54.327 "ffdhe6144", 00:06:54.327 "ffdhe8192" 00:06:54.327 ] 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "bdev_nvme_set_hotplug", 00:06:54.327 "params": { 00:06:54.327 "period_us": 100000, 00:06:54.327 "enable": false 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "bdev_iscsi_set_options", 00:06:54.327 "params": { 00:06:54.327 "timeout_sec": 30 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "bdev_wait_for_examine" 00:06:54.327 } 00:06:54.327 ] 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "nvmf", 00:06:54.327 "config": [ 00:06:54.327 { 00:06:54.327 "method": "nvmf_set_config", 00:06:54.327 "params": { 00:06:54.327 "discovery_filter": "match_any", 00:06:54.327 "admin_cmd_passthru": { 00:06:54.327 "identify_ctrlr": false 00:06:54.327 }, 00:06:54.327 "dhchap_digests": [ 00:06:54.327 "sha256", 00:06:54.327 "sha384", 00:06:54.327 "sha512" 00:06:54.327 ], 00:06:54.327 "dhchap_dhgroups": [ 00:06:54.327 "null", 00:06:54.327 "ffdhe2048", 00:06:54.327 "ffdhe3072", 00:06:54.327 "ffdhe4096", 00:06:54.327 "ffdhe6144", 00:06:54.327 "ffdhe8192" 00:06:54.327 ] 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "nvmf_set_max_subsystems", 00:06:54.327 "params": { 00:06:54.327 "max_subsystems": 1024 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "nvmf_set_crdt", 00:06:54.327 "params": { 00:06:54.327 "crdt1": 0, 00:06:54.327 "crdt2": 0, 00:06:54.327 "crdt3": 0 00:06:54.327 } 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "method": "nvmf_create_transport", 00:06:54.327 "params": { 00:06:54.327 "trtype": "TCP", 00:06:54.327 "max_queue_depth": 128, 00:06:54.327 "max_io_qpairs_per_ctrlr": 127, 00:06:54.327 "in_capsule_data_size": 4096, 00:06:54.327 "max_io_size": 131072, 00:06:54.327 "io_unit_size": 131072, 00:06:54.327 "max_aq_depth": 128, 00:06:54.327 "num_shared_buffers": 511, 00:06:54.327 "buf_cache_size": 4294967295, 00:06:54.327 "dif_insert_or_strip": false, 00:06:54.327 "zcopy": false, 00:06:54.327 "c2h_success": true, 00:06:54.327 "sock_priority": 0, 00:06:54.327 "abort_timeout_sec": 1, 00:06:54.327 "ack_timeout": 0, 00:06:54.327 "data_wr_pool_size": 0 00:06:54.327 } 00:06:54.327 } 00:06:54.327 ] 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "nbd", 00:06:54.327 "config": [] 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "ublk", 00:06:54.327 "config": [] 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "vhost_blk", 00:06:54.327 "config": [] 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "scsi", 00:06:54.327 "config": null 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "iscsi", 00:06:54.327 "config": [ 00:06:54.327 { 00:06:54.327 "method": "iscsi_set_options", 00:06:54.327 "params": { 00:06:54.327 "node_base": "iqn.2016-06.io.spdk", 00:06:54.327 "max_sessions": 128, 00:06:54.327 "max_connections_per_session": 2, 
00:06:54.327 "max_queue_depth": 64, 00:06:54.327 "default_time2wait": 2, 00:06:54.327 "default_time2retain": 20, 00:06:54.327 "first_burst_length": 8192, 00:06:54.327 "immediate_data": true, 00:06:54.327 "allow_duplicated_isid": false, 00:06:54.327 "error_recovery_level": 0, 00:06:54.327 "nop_timeout": 60, 00:06:54.327 "nop_in_interval": 30, 00:06:54.327 "disable_chap": false, 00:06:54.327 "require_chap": false, 00:06:54.327 "mutual_chap": false, 00:06:54.327 "chap_group": 0, 00:06:54.327 "max_large_datain_per_connection": 64, 00:06:54.327 "max_r2t_per_connection": 4, 00:06:54.327 "pdu_pool_size": 36864, 00:06:54.327 "immediate_data_pool_size": 16384, 00:06:54.327 "data_out_pool_size": 2048 00:06:54.327 } 00:06:54.327 } 00:06:54.327 ] 00:06:54.327 }, 00:06:54.327 { 00:06:54.327 "subsystem": "vhost_scsi", 00:06:54.327 "config": [] 00:06:54.327 } 00:06:54.327 ] 00:06:54.327 } 00:06:54.327 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:54.327 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1734018 00:06:54.327 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 1734018 ']' 00:06:54.327 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 1734018 00:06:54.327 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:54.328 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:54.328 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1734018 00:06:54.586 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:54.586 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:54.586 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1734018' 00:06:54.586 killing process with pid 1734018 00:06:54.586 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 1734018 00:06:54.586 19:20:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 1734018 00:06:54.845 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1734090 00:06:54.845 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:54.845 19:20:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1734090 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 1734090 ']' 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 1734090 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1734090 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:00.109 19:20:19 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1734090' 00:07:00.109 killing process with pid 1734090 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 1734090 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 1734090 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:07:00.109 00:07:00.109 real 0m6.225s 00:07:00.109 user 0m5.907s 00:07:00.109 sys 0m0.654s 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:00.109 ************************************ 00:07:00.109 END TEST skip_rpc_with_json 00:07:00.109 ************************************ 00:07:00.109 19:20:19 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:00.109 19:20:19 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:00.109 19:20:19 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:00.109 19:20:19 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:00.109 ************************************ 00:07:00.109 START TEST skip_rpc_with_delay 00:07:00.109 ************************************ 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:00.109 19:20:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:00.367 [2024-11-29 19:20:20.024507] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:07:00.367 19:20:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:07:00.367 19:20:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:00.367 19:20:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:00.367 19:20:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:00.367 00:07:00.367 real 0m0.048s 00:07:00.367 user 0m0.019s 00:07:00.367 sys 0m0.029s 00:07:00.367 19:20:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:00.367 19:20:20 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:00.367 ************************************ 00:07:00.367 END TEST skip_rpc_with_delay 00:07:00.367 ************************************ 00:07:00.367 19:20:20 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:00.367 19:20:20 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:00.367 19:20:20 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:00.367 19:20:20 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:00.367 19:20:20 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:00.367 19:20:20 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:00.367 ************************************ 00:07:00.367 START TEST exit_on_failed_rpc_init 00:07:00.368 ************************************ 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1735199 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1735199 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 1735199 ']' 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:00.368 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:00.368 [2024-11-29 19:20:20.159146] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
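exit_on_failed_rpc_init, which starts here, checks that a second target refuses to come up when the RPC listen address is already taken: the first spdk_tgt owns /var/tmp/spdk.sock, and the second instance (-m 0x2) must fail with the "in use" error seen below, exiting non-zero so that the NOT wrapper passes. Stripped to its essentials (waitforlisten is the autotest_common.sh helper that polls for the RPC socket; this sketch omits the test's trap handling):

    build/bin/spdk_tgt -m 0x1 &            # first target binds /var/tmp/spdk.sock
    waitforlisten $!
    if build/bin/spdk_tgt -m 0x2; then     # same socket path: RPC init must fail
        echo "unexpected: second target came up" >&2
        exit 1
    fi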
00:07:00.368 [2024-11-29 19:20:20.159208] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735199 ] 00:07:00.368 [2024-11-29 19:20:20.231072] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.368 [2024-11-29 19:20:20.253913] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:00.627 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:00.627 [2024-11-29 19:20:20.481461] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:00.627 [2024-11-29 19:20:20.481520] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735205 ] 00:07:00.887 [2024-11-29 19:20:20.549354] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.887 [2024-11-29 19:20:20.571747] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.887 [2024-11-29 19:20:20.571854] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
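The exit-status bookkeeping in the trace lines below is worth decoding: the failed second target exits with 234, the NOT helper treats anything above 128 as signal-biased and subtracts 128 (giving 106), and the following case statement collapses the code to es=1, so the final (( !es == 0 )) assertion only cares that the command failed, not how. A paraphrase of what the trace implies, not the verbatim autotest_common.sh helper:

    es=$?                      # 234 from the second spdk_tgt's failed startup
    if (( es > 128 )); then
        es=$((es - 128))       # strip the signal bias -> 106
    fi
    case "$es" in
        *) es=1 ;;             # the real helper matches specific codes; any failure maps to 1 here
    esac
    (( !es == 0 ))             # NOT passes iff the wrapped command failed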
00:07:00.887 [2024-11-29 19:20:20.571868] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock
00:07:00.887 [2024-11-29 19:20:20.571876] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 ))
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 ))
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1735199
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 1735199 ']'
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 1735199
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1735199
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1735199' killing process with pid 1735199
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 1735199
00:07:00.887 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 1735199
00:07:01.146
00:07:01.146 real 0m0.828s
00:07:01.146 user 0m0.808s
00:07:01.146 sys 0m0.409s
00:07:01.146 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:01.146 19:20:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x
00:07:01.146 ************************************
00:07:01.146 END TEST exit_on_failed_rpc_init
00:07:01.146 ************************************
00:07:01.146 19:20:20 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json
00:07:01.146
00:07:01.146 real 0m13.004s
00:07:01.146 user 0m12.090s
00:07:01.146 sys 0m1.726s
00:07:01.147 19:20:21 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:01.147 19:20:21 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:01.147 ************************************
00:07:01.147 END TEST skip_rpc
00:07:01.147 ************************************
00:07:01.147 19:20:21 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:01.147 19:20:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:01.147 19:20:21 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:01.147 19:20:21
-- common/autotest_common.sh@10 -- # set +x 00:07:01.406 ************************************ 00:07:01.406 START TEST rpc_client 00:07:01.406 ************************************ 00:07:01.406 19:20:21 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:01.406 * Looking for test storage... 00:07:01.406 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:01.406 19:20:21 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:01.406 19:20:21 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:07:01.406 19:20:21 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:01.406 19:20:21 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@345 -- # : 1 00:07:01.406 19:20:21 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@353 -- # local d=1 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@355 -- # echo 1 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@353 -- # local d=2 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@355 -- # echo 2 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:01.407 19:20:21 rpc_client -- scripts/common.sh@368 -- # return 0 00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:01.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.407 --rc genhtml_branch_coverage=1 00:07:01.407 --rc genhtml_function_coverage=1 00:07:01.407 --rc genhtml_legend=1 00:07:01.407 --rc geninfo_all_blocks=1 00:07:01.407 --rc geninfo_unexecuted_blocks=1 00:07:01.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.407 ' 00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:01.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.407 --rc genhtml_branch_coverage=1 00:07:01.407 --rc genhtml_function_coverage=1 00:07:01.407 --rc genhtml_legend=1 00:07:01.407 --rc geninfo_all_blocks=1 00:07:01.407 --rc geninfo_unexecuted_blocks=1 00:07:01.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.407 ' 00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:01.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.407 --rc genhtml_branch_coverage=1 00:07:01.407 --rc genhtml_function_coverage=1 00:07:01.407 --rc genhtml_legend=1 00:07:01.407 --rc geninfo_all_blocks=1 00:07:01.407 --rc geninfo_unexecuted_blocks=1 00:07:01.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.407 ' 00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:01.407 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.407 --rc genhtml_branch_coverage=1 00:07:01.407 --rc genhtml_function_coverage=1 00:07:01.407 --rc genhtml_legend=1 00:07:01.407 --rc geninfo_all_blocks=1 00:07:01.407 --rc geninfo_unexecuted_blocks=1 00:07:01.407 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.407 ' 00:07:01.407 19:20:21 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:01.407 OK 00:07:01.407 19:20:21 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:01.407 00:07:01.407 real 0m0.210s 00:07:01.407 user 0m0.115s 00:07:01.407 sys 0m0.111s 00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
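
For context: the xtrace block above is scripts/common.sh checking whether the installed lcov predates version 2, which selects between the legacy and current --rc coverage option spellings exported right after it. Condensed into a hypothetical bash re-implementation (numeric version fields only; the real cmp_versions helper supports other comparison operators through the same field-by-field loop):

    # Split both versions on '.', '-' or ':' and compare numerically per field.
    version_lt() {
        local IFS=.-:
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0   # strictly smaller
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1   # strictly larger
        done
        return 1   # all fields equal, so not less-than
    }

    version_lt 1.15 2 && echo "lcov < 2: keep the legacy --rc option names"
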
00:07:01.407 19:20:21 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:01.407 ************************************ 00:07:01.407 END TEST rpc_client 00:07:01.407 ************************************ 00:07:01.667 19:20:21 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:01.667 19:20:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:01.667 19:20:21 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:01.667 19:20:21 -- common/autotest_common.sh@10 -- # set +x 00:07:01.667 ************************************ 00:07:01.667 START TEST json_config 00:07:01.667 ************************************ 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:01.667 19:20:21 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:01.667 19:20:21 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:07:01.667 19:20:21 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:07:01.667 19:20:21 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:07:01.667 19:20:21 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:01.667 19:20:21 json_config -- scripts/common.sh@344 -- # case "$op" in 00:07:01.667 19:20:21 json_config -- scripts/common.sh@345 -- # : 1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:01.667 19:20:21 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:01.667 19:20:21 json_config -- scripts/common.sh@365 -- # decimal 1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@353 -- # local d=1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:01.667 19:20:21 json_config -- scripts/common.sh@355 -- # echo 1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:07:01.667 19:20:21 json_config -- scripts/common.sh@366 -- # decimal 2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@353 -- # local d=2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:01.667 19:20:21 json_config -- scripts/common.sh@355 -- # echo 2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:07:01.667 19:20:21 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:01.667 19:20:21 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:01.667 19:20:21 json_config -- scripts/common.sh@368 -- # return 0 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:01.667 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.667 --rc genhtml_branch_coverage=1 00:07:01.667 --rc genhtml_function_coverage=1 00:07:01.667 --rc genhtml_legend=1 00:07:01.667 --rc geninfo_all_blocks=1 00:07:01.667 --rc geninfo_unexecuted_blocks=1 00:07:01.667 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.667 ' 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:01.667 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.667 --rc genhtml_branch_coverage=1 00:07:01.667 --rc genhtml_function_coverage=1 00:07:01.667 --rc genhtml_legend=1 00:07:01.667 --rc geninfo_all_blocks=1 00:07:01.667 --rc geninfo_unexecuted_blocks=1 00:07:01.667 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.667 ' 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:01.667 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.667 --rc genhtml_branch_coverage=1 00:07:01.667 --rc genhtml_function_coverage=1 00:07:01.667 --rc genhtml_legend=1 00:07:01.667 --rc geninfo_all_blocks=1 00:07:01.667 --rc geninfo_unexecuted_blocks=1 00:07:01.667 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.667 ' 00:07:01.667 19:20:21 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:01.667 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.667 --rc genhtml_branch_coverage=1 00:07:01.667 --rc genhtml_function_coverage=1 00:07:01.667 --rc genhtml_legend=1 00:07:01.667 --rc geninfo_all_blocks=1 00:07:01.667 --rc geninfo_unexecuted_blocks=1 00:07:01.667 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.667 ' 00:07:01.667 19:20:21 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:01.667 19:20:21 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:01.928 19:20:21 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:07:01.928 19:20:21 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:01.928 19:20:21 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:01.928 19:20:21 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:01.928 19:20:21 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.928 19:20:21 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.928 19:20:21 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.928 19:20:21 json_config -- paths/export.sh@5 -- # export PATH 00:07:01.928 19:20:21 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@51 -- # : 0 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:01.928 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:01.928 19:20:21 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:01.928 WARNING: No tests are enabled so not running JSON configuration tests 00:07:01.928 19:20:21 json_config -- json_config/json_config.sh@28 -- # exit 0 00:07:01.928 00:07:01.928 real 0m0.205s 00:07:01.928 user 0m0.120s 00:07:01.928 sys 0m0.090s 00:07:01.928 19:20:21 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:01.928 19:20:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:01.928 ************************************ 00:07:01.928 END TEST json_config 00:07:01.928 ************************************ 00:07:01.928 19:20:21 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:01.928 19:20:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:01.928 19:20:21 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:01.928 19:20:21 -- common/autotest_common.sh@10 -- # set +x 00:07:01.928 ************************************ 00:07:01.928 START TEST json_config_extra_key 00:07:01.928 ************************************ 00:07:01.928 19:20:21 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:01.928 19:20:21 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:01.928 19:20:21 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:07:01.928 19:20:21 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:01.928 19:20:21 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:07:01.928 19:20:21 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:01.929 19:20:21 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:07:01.929 19:20:21 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:01.929 19:20:21 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:01.929 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.929 --rc genhtml_branch_coverage=1 00:07:01.929 --rc genhtml_function_coverage=1 00:07:01.929 --rc genhtml_legend=1 00:07:01.929 --rc geninfo_all_blocks=1 00:07:01.929 --rc geninfo_unexecuted_blocks=1 00:07:01.929 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.929 ' 00:07:01.929 19:20:21 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:01.929 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.929 --rc genhtml_branch_coverage=1 
00:07:01.929 --rc genhtml_function_coverage=1 00:07:01.929 --rc genhtml_legend=1 00:07:01.929 --rc geninfo_all_blocks=1 00:07:01.929 --rc geninfo_unexecuted_blocks=1 00:07:01.929 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.929 ' 00:07:01.929 19:20:21 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:01.929 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.929 --rc genhtml_branch_coverage=1 00:07:01.929 --rc genhtml_function_coverage=1 00:07:01.929 --rc genhtml_legend=1 00:07:01.929 --rc geninfo_all_blocks=1 00:07:01.929 --rc geninfo_unexecuted_blocks=1 00:07:01.929 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.929 ' 00:07:01.929 19:20:21 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:01.929 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.929 --rc genhtml_branch_coverage=1 00:07:01.929 --rc genhtml_function_coverage=1 00:07:01.929 --rc genhtml_legend=1 00:07:01.929 --rc geninfo_all_blocks=1 00:07:01.929 --rc geninfo_unexecuted_blocks=1 00:07:01.929 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.929 ' 00:07:01.929 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:01.929 19:20:21 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:02.189 19:20:21 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:07:02.189 19:20:21 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:02.189 19:20:21 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:02.189 19:20:21 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:02.189 19:20:21 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:02.189 19:20:21 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:02.189 19:20:21 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:02.189 19:20:21 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:02.189 19:20:21 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:02.189 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:02.189 19:20:21 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:02.189 INFO: launching applications... 00:07:02.189 19:20:21 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1735639 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:02.189 Waiting for target to run... 00:07:02.189 19:20:21 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1735639 /var/tmp/spdk_tgt.sock 00:07:02.189 19:20:21 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 1735639 ']' 00:07:02.190 19:20:21 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:02.190 19:20:21 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:02.190 19:20:21 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:02.190 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
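
The associative arrays traced above are the per-app bookkeeping used by test/json_config/common.sh: pid, RPC socket, startup parameters and JSON config are all keyed by app name ("target" here). A reduced sketch of that launch-and-wait pattern (SPDK_DIR is a stand-in for the checkout path; the poll loop is a simplification of waitforlisten, which additionally verifies between retries that the pid is still alive):

    declare -A app_pid=([target]='')
    declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
    declare -A app_params=([target]='-m 0x1 -s 1024')
    declare -A configs_path=([target]="$SPDK_DIR/test/json_config/extra_key.json")

    app=target
    # Parameters left unquoted on purpose so the string splits into flags.
    "$SPDK_DIR/build/bin/spdk_tgt" ${app_params[$app]} \
        -r "${app_socket[$app]}" --json "${configs_path[$app]}" &
    app_pid[$app]=$!

    # Poll the app's private RPC socket until it answers; spdk_get_version is
    # one of the methods listed in the rpc_get_methods dump further below.
    until "$SPDK_DIR/scripts/rpc.py" -s "${app_socket[$app]}" spdk_get_version \
            >/dev/null 2>&1; do
        sleep 0.5
    done
    echo "target is up with pid ${app_pid[$app]}"
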
00:07:02.190 19:20:21 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:02.190 19:20:21 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:02.190 19:20:21 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:02.190 [2024-11-29 19:20:21.869977] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:02.190 [2024-11-29 19:20:21.870068] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735639 ] 00:07:02.449 [2024-11-29 19:20:22.149471] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.449 [2024-11-29 19:20:22.161884] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.017 19:20:22 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:03.017 19:20:22 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:03.017 00:07:03.017 19:20:22 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:03.017 INFO: shutting down applications... 00:07:03.017 19:20:22 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1735639 ]] 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1735639 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1735639 00:07:03.017 19:20:22 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1735639 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:03.586 19:20:23 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:03.586 SPDK target shutdown done 00:07:03.586 19:20:23 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:03.586 Success 00:07:03.586 00:07:03.586 real 0m1.547s 00:07:03.586 user 0m1.276s 00:07:03.586 sys 0m0.422s 00:07:03.586 19:20:23 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:03.586 19:20:23 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:03.586 
************************************ 00:07:03.586 END TEST json_config_extra_key 00:07:03.586 ************************************ 00:07:03.586 19:20:23 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:03.586 19:20:23 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:03.586 19:20:23 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:03.586 19:20:23 -- common/autotest_common.sh@10 -- # set +x 00:07:03.586 ************************************ 00:07:03.586 START TEST alias_rpc 00:07:03.586 ************************************ 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:03.586 * Looking for test storage... 00:07:03.586 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@345 -- # : 1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:03.586 19:20:23 alias_rpc -- scripts/common.sh@368 -- # return 0 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:03.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.586 --rc genhtml_branch_coverage=1 00:07:03.586 --rc genhtml_function_coverage=1 00:07:03.586 --rc genhtml_legend=1 00:07:03.586 --rc geninfo_all_blocks=1 00:07:03.586 --rc geninfo_unexecuted_blocks=1 00:07:03.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.586 ' 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:03.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.586 --rc genhtml_branch_coverage=1 00:07:03.586 --rc genhtml_function_coverage=1 00:07:03.586 --rc genhtml_legend=1 00:07:03.586 --rc geninfo_all_blocks=1 00:07:03.586 --rc geninfo_unexecuted_blocks=1 00:07:03.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.586 ' 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:03.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.586 --rc genhtml_branch_coverage=1 00:07:03.586 --rc genhtml_function_coverage=1 00:07:03.586 --rc genhtml_legend=1 00:07:03.586 --rc geninfo_all_blocks=1 00:07:03.586 --rc geninfo_unexecuted_blocks=1 00:07:03.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.586 ' 00:07:03.586 19:20:23 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:03.586 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.586 --rc genhtml_branch_coverage=1 00:07:03.586 --rc genhtml_function_coverage=1 00:07:03.586 --rc genhtml_legend=1 00:07:03.586 --rc geninfo_all_blocks=1 00:07:03.586 --rc geninfo_unexecuted_blocks=1 00:07:03.586 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.586 ' 00:07:03.586 19:20:23 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:03.845 19:20:23 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1735967 00:07:03.845 19:20:23 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1735967 00:07:03.845 19:20:23 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:03.845 19:20:23 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 1735967 ']' 00:07:03.845 19:20:23 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:03.845 19:20:23 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:03.845 19:20:23 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:03.845 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:03.845 19:20:23 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:03.845 19:20:23 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.845 [2024-11-29 19:20:23.517317] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:03.845 [2024-11-29 19:20:23.517382] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735967 ] 00:07:03.845 [2024-11-29 19:20:23.587743] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.845 [2024-11-29 19:20:23.610368] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.104 19:20:23 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:04.104 19:20:23 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:04.104 19:20:23 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:04.104 19:20:24 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1735967 00:07:04.104 19:20:24 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 1735967 ']' 00:07:04.104 19:20:24 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 1735967 00:07:04.104 19:20:24 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1735967 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1735967' 00:07:04.363 killing process with pid 1735967 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@973 -- # kill 1735967 00:07:04.363 19:20:24 alias_rpc -- common/autotest_common.sh@978 -- # wait 1735967 00:07:04.623 00:07:04.623 real 0m1.064s 00:07:04.623 user 0m1.032s 00:07:04.623 sys 0m0.460s 00:07:04.623 19:20:24 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:04.623 19:20:24 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:04.623 ************************************ 00:07:04.623 END TEST alias_rpc 00:07:04.623 ************************************ 00:07:04.623 19:20:24 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:07:04.623 19:20:24 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:04.623 19:20:24 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:04.623 19:20:24 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:04.623 19:20:24 -- common/autotest_common.sh@10 -- # set +x 00:07:04.623 ************************************ 00:07:04.623 START TEST 
spdkcli_tcp 00:07:04.623 ************************************ 00:07:04.623 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:04.882 * Looking for test storage... 00:07:04.882 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:04.882 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:04.882 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:07:04.882 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:04.882 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:07:04.882 19:20:24 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:04.883 19:20:24 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:04.883 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.883 --rc genhtml_branch_coverage=1 00:07:04.883 --rc genhtml_function_coverage=1 00:07:04.883 --rc genhtml_legend=1 00:07:04.883 --rc geninfo_all_blocks=1 00:07:04.883 --rc geninfo_unexecuted_blocks=1 00:07:04.883 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.883 ' 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:04.883 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.883 --rc genhtml_branch_coverage=1 00:07:04.883 --rc genhtml_function_coverage=1 00:07:04.883 --rc genhtml_legend=1 00:07:04.883 --rc geninfo_all_blocks=1 00:07:04.883 --rc geninfo_unexecuted_blocks=1 00:07:04.883 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.883 ' 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:04.883 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.883 --rc genhtml_branch_coverage=1 00:07:04.883 --rc genhtml_function_coverage=1 00:07:04.883 --rc genhtml_legend=1 00:07:04.883 --rc geninfo_all_blocks=1 00:07:04.883 --rc geninfo_unexecuted_blocks=1 00:07:04.883 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.883 ' 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:04.883 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.883 --rc genhtml_branch_coverage=1 00:07:04.883 --rc genhtml_function_coverage=1 00:07:04.883 --rc genhtml_legend=1 00:07:04.883 --rc geninfo_all_blocks=1 00:07:04.883 --rc geninfo_unexecuted_blocks=1 00:07:04.883 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.883 ' 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1736294 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1736294 00:07:04.883 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 1736294 ']' 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:04.883 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:04.883 19:20:24 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.883 [2024-11-29 19:20:24.660461] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:04.883 [2024-11-29 19:20:24.660524] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736294 ] 00:07:04.883 [2024-11-29 19:20:24.730658] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:04.883 [2024-11-29 19:20:24.754643] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.883 [2024-11-29 19:20:24.754646] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.142 19:20:24 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:05.142 19:20:24 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:07:05.142 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1736299 00:07:05.142 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:05.142 19:20:24 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:05.401 [ 00:07:05.401 "spdk_get_version", 00:07:05.401 "rpc_get_methods", 00:07:05.401 "notify_get_notifications", 00:07:05.401 "notify_get_types", 00:07:05.401 "trace_get_info", 00:07:05.401 "trace_get_tpoint_group_mask", 00:07:05.401 "trace_disable_tpoint_group", 00:07:05.401 "trace_enable_tpoint_group", 00:07:05.401 "trace_clear_tpoint_mask", 00:07:05.401 "trace_set_tpoint_mask", 00:07:05.401 "fsdev_set_opts", 00:07:05.401 "fsdev_get_opts", 00:07:05.401 "framework_get_pci_devices", 00:07:05.401 "framework_get_config", 00:07:05.401 "framework_get_subsystems", 00:07:05.401 "vfu_tgt_set_base_path", 00:07:05.401 
"keyring_get_keys", 00:07:05.401 "iobuf_get_stats", 00:07:05.401 "iobuf_set_options", 00:07:05.401 "sock_get_default_impl", 00:07:05.401 "sock_set_default_impl", 00:07:05.401 "sock_impl_set_options", 00:07:05.401 "sock_impl_get_options", 00:07:05.401 "vmd_rescan", 00:07:05.401 "vmd_remove_device", 00:07:05.401 "vmd_enable", 00:07:05.401 "accel_get_stats", 00:07:05.401 "accel_set_options", 00:07:05.401 "accel_set_driver", 00:07:05.401 "accel_crypto_key_destroy", 00:07:05.401 "accel_crypto_keys_get", 00:07:05.401 "accel_crypto_key_create", 00:07:05.401 "accel_assign_opc", 00:07:05.401 "accel_get_module_info", 00:07:05.401 "accel_get_opc_assignments", 00:07:05.401 "bdev_get_histogram", 00:07:05.401 "bdev_enable_histogram", 00:07:05.401 "bdev_set_qos_limit", 00:07:05.401 "bdev_set_qd_sampling_period", 00:07:05.401 "bdev_get_bdevs", 00:07:05.401 "bdev_reset_iostat", 00:07:05.401 "bdev_get_iostat", 00:07:05.401 "bdev_examine", 00:07:05.401 "bdev_wait_for_examine", 00:07:05.401 "bdev_set_options", 00:07:05.401 "scsi_get_devices", 00:07:05.401 "thread_set_cpumask", 00:07:05.401 "scheduler_set_options", 00:07:05.401 "framework_get_governor", 00:07:05.401 "framework_get_scheduler", 00:07:05.401 "framework_set_scheduler", 00:07:05.401 "framework_get_reactors", 00:07:05.401 "thread_get_io_channels", 00:07:05.401 "thread_get_pollers", 00:07:05.401 "thread_get_stats", 00:07:05.401 "framework_monitor_context_switch", 00:07:05.401 "spdk_kill_instance", 00:07:05.401 "log_enable_timestamps", 00:07:05.401 "log_get_flags", 00:07:05.401 "log_clear_flag", 00:07:05.401 "log_set_flag", 00:07:05.401 "log_get_level", 00:07:05.401 "log_set_level", 00:07:05.401 "log_get_print_level", 00:07:05.401 "log_set_print_level", 00:07:05.401 "framework_enable_cpumask_locks", 00:07:05.401 "framework_disable_cpumask_locks", 00:07:05.401 "framework_wait_init", 00:07:05.401 "framework_start_init", 00:07:05.401 "virtio_blk_create_transport", 00:07:05.401 "virtio_blk_get_transports", 00:07:05.401 "vhost_controller_set_coalescing", 00:07:05.401 "vhost_get_controllers", 00:07:05.401 "vhost_delete_controller", 00:07:05.401 "vhost_create_blk_controller", 00:07:05.401 "vhost_scsi_controller_remove_target", 00:07:05.401 "vhost_scsi_controller_add_target", 00:07:05.401 "vhost_start_scsi_controller", 00:07:05.401 "vhost_create_scsi_controller", 00:07:05.401 "ublk_recover_disk", 00:07:05.401 "ublk_get_disks", 00:07:05.401 "ublk_stop_disk", 00:07:05.402 "ublk_start_disk", 00:07:05.402 "ublk_destroy_target", 00:07:05.402 "ublk_create_target", 00:07:05.402 "nbd_get_disks", 00:07:05.402 "nbd_stop_disk", 00:07:05.402 "nbd_start_disk", 00:07:05.402 "env_dpdk_get_mem_stats", 00:07:05.402 "nvmf_stop_mdns_prr", 00:07:05.402 "nvmf_publish_mdns_prr", 00:07:05.402 "nvmf_subsystem_get_listeners", 00:07:05.402 "nvmf_subsystem_get_qpairs", 00:07:05.402 "nvmf_subsystem_get_controllers", 00:07:05.402 "nvmf_get_stats", 00:07:05.402 "nvmf_get_transports", 00:07:05.402 "nvmf_create_transport", 00:07:05.402 "nvmf_get_targets", 00:07:05.402 "nvmf_delete_target", 00:07:05.402 "nvmf_create_target", 00:07:05.402 "nvmf_subsystem_allow_any_host", 00:07:05.402 "nvmf_subsystem_set_keys", 00:07:05.402 "nvmf_subsystem_remove_host", 00:07:05.402 "nvmf_subsystem_add_host", 00:07:05.402 "nvmf_ns_remove_host", 00:07:05.402 "nvmf_ns_add_host", 00:07:05.402 "nvmf_subsystem_remove_ns", 00:07:05.402 "nvmf_subsystem_set_ns_ana_group", 00:07:05.402 "nvmf_subsystem_add_ns", 00:07:05.402 "nvmf_subsystem_listener_set_ana_state", 00:07:05.402 "nvmf_discovery_get_referrals", 
00:07:05.402 "nvmf_discovery_remove_referral", 00:07:05.402 "nvmf_discovery_add_referral", 00:07:05.402 "nvmf_subsystem_remove_listener", 00:07:05.402 "nvmf_subsystem_add_listener", 00:07:05.402 "nvmf_delete_subsystem", 00:07:05.402 "nvmf_create_subsystem", 00:07:05.402 "nvmf_get_subsystems", 00:07:05.402 "nvmf_set_crdt", 00:07:05.402 "nvmf_set_config", 00:07:05.402 "nvmf_set_max_subsystems", 00:07:05.402 "iscsi_get_histogram", 00:07:05.402 "iscsi_enable_histogram", 00:07:05.402 "iscsi_set_options", 00:07:05.402 "iscsi_get_auth_groups", 00:07:05.402 "iscsi_auth_group_remove_secret", 00:07:05.402 "iscsi_auth_group_add_secret", 00:07:05.402 "iscsi_delete_auth_group", 00:07:05.402 "iscsi_create_auth_group", 00:07:05.402 "iscsi_set_discovery_auth", 00:07:05.402 "iscsi_get_options", 00:07:05.402 "iscsi_target_node_request_logout", 00:07:05.402 "iscsi_target_node_set_redirect", 00:07:05.402 "iscsi_target_node_set_auth", 00:07:05.402 "iscsi_target_node_add_lun", 00:07:05.402 "iscsi_get_stats", 00:07:05.402 "iscsi_get_connections", 00:07:05.402 "iscsi_portal_group_set_auth", 00:07:05.402 "iscsi_start_portal_group", 00:07:05.402 "iscsi_delete_portal_group", 00:07:05.402 "iscsi_create_portal_group", 00:07:05.402 "iscsi_get_portal_groups", 00:07:05.402 "iscsi_delete_target_node", 00:07:05.402 "iscsi_target_node_remove_pg_ig_maps", 00:07:05.402 "iscsi_target_node_add_pg_ig_maps", 00:07:05.402 "iscsi_create_target_node", 00:07:05.402 "iscsi_get_target_nodes", 00:07:05.402 "iscsi_delete_initiator_group", 00:07:05.402 "iscsi_initiator_group_remove_initiators", 00:07:05.402 "iscsi_initiator_group_add_initiators", 00:07:05.402 "iscsi_create_initiator_group", 00:07:05.402 "iscsi_get_initiator_groups", 00:07:05.402 "fsdev_aio_delete", 00:07:05.402 "fsdev_aio_create", 00:07:05.402 "keyring_linux_set_options", 00:07:05.402 "keyring_file_remove_key", 00:07:05.402 "keyring_file_add_key", 00:07:05.402 "vfu_virtio_create_fs_endpoint", 00:07:05.402 "vfu_virtio_create_scsi_endpoint", 00:07:05.402 "vfu_virtio_scsi_remove_target", 00:07:05.402 "vfu_virtio_scsi_add_target", 00:07:05.402 "vfu_virtio_create_blk_endpoint", 00:07:05.402 "vfu_virtio_delete_endpoint", 00:07:05.402 "iaa_scan_accel_module", 00:07:05.402 "dsa_scan_accel_module", 00:07:05.402 "ioat_scan_accel_module", 00:07:05.402 "accel_error_inject_error", 00:07:05.402 "bdev_iscsi_delete", 00:07:05.402 "bdev_iscsi_create", 00:07:05.402 "bdev_iscsi_set_options", 00:07:05.402 "bdev_virtio_attach_controller", 00:07:05.402 "bdev_virtio_scsi_get_devices", 00:07:05.402 "bdev_virtio_detach_controller", 00:07:05.402 "bdev_virtio_blk_set_hotplug", 00:07:05.402 "bdev_ftl_set_property", 00:07:05.402 "bdev_ftl_get_properties", 00:07:05.402 "bdev_ftl_get_stats", 00:07:05.402 "bdev_ftl_unmap", 00:07:05.402 "bdev_ftl_unload", 00:07:05.402 "bdev_ftl_delete", 00:07:05.402 "bdev_ftl_load", 00:07:05.402 "bdev_ftl_create", 00:07:05.402 "bdev_aio_delete", 00:07:05.402 "bdev_aio_rescan", 00:07:05.402 "bdev_aio_create", 00:07:05.402 "blobfs_create", 00:07:05.402 "blobfs_detect", 00:07:05.402 "blobfs_set_cache_size", 00:07:05.402 "bdev_zone_block_delete", 00:07:05.402 "bdev_zone_block_create", 00:07:05.402 "bdev_delay_delete", 00:07:05.402 "bdev_delay_create", 00:07:05.402 "bdev_delay_update_latency", 00:07:05.402 "bdev_split_delete", 00:07:05.402 "bdev_split_create", 00:07:05.402 "bdev_error_inject_error", 00:07:05.402 "bdev_error_delete", 00:07:05.402 "bdev_error_create", 00:07:05.402 "bdev_raid_set_options", 00:07:05.402 "bdev_raid_remove_base_bdev", 00:07:05.402 
"bdev_raid_add_base_bdev", 00:07:05.402 "bdev_raid_delete", 00:07:05.402 "bdev_raid_create", 00:07:05.402 "bdev_raid_get_bdevs", 00:07:05.402 "bdev_lvol_set_parent_bdev", 00:07:05.402 "bdev_lvol_set_parent", 00:07:05.402 "bdev_lvol_check_shallow_copy", 00:07:05.402 "bdev_lvol_start_shallow_copy", 00:07:05.402 "bdev_lvol_grow_lvstore", 00:07:05.402 "bdev_lvol_get_lvols", 00:07:05.402 "bdev_lvol_get_lvstores", 00:07:05.402 "bdev_lvol_delete", 00:07:05.402 "bdev_lvol_set_read_only", 00:07:05.402 "bdev_lvol_resize", 00:07:05.402 "bdev_lvol_decouple_parent", 00:07:05.402 "bdev_lvol_inflate", 00:07:05.402 "bdev_lvol_rename", 00:07:05.402 "bdev_lvol_clone_bdev", 00:07:05.402 "bdev_lvol_clone", 00:07:05.402 "bdev_lvol_snapshot", 00:07:05.402 "bdev_lvol_create", 00:07:05.402 "bdev_lvol_delete_lvstore", 00:07:05.402 "bdev_lvol_rename_lvstore", 00:07:05.402 "bdev_lvol_create_lvstore", 00:07:05.402 "bdev_passthru_delete", 00:07:05.402 "bdev_passthru_create", 00:07:05.402 "bdev_nvme_cuse_unregister", 00:07:05.402 "bdev_nvme_cuse_register", 00:07:05.402 "bdev_opal_new_user", 00:07:05.402 "bdev_opal_set_lock_state", 00:07:05.402 "bdev_opal_delete", 00:07:05.402 "bdev_opal_get_info", 00:07:05.402 "bdev_opal_create", 00:07:05.402 "bdev_nvme_opal_revert", 00:07:05.402 "bdev_nvme_opal_init", 00:07:05.402 "bdev_nvme_send_cmd", 00:07:05.402 "bdev_nvme_set_keys", 00:07:05.402 "bdev_nvme_get_path_iostat", 00:07:05.402 "bdev_nvme_get_mdns_discovery_info", 00:07:05.402 "bdev_nvme_stop_mdns_discovery", 00:07:05.402 "bdev_nvme_start_mdns_discovery", 00:07:05.402 "bdev_nvme_set_multipath_policy", 00:07:05.402 "bdev_nvme_set_preferred_path", 00:07:05.402 "bdev_nvme_get_io_paths", 00:07:05.402 "bdev_nvme_remove_error_injection", 00:07:05.402 "bdev_nvme_add_error_injection", 00:07:05.402 "bdev_nvme_get_discovery_info", 00:07:05.402 "bdev_nvme_stop_discovery", 00:07:05.402 "bdev_nvme_start_discovery", 00:07:05.402 "bdev_nvme_get_controller_health_info", 00:07:05.402 "bdev_nvme_disable_controller", 00:07:05.402 "bdev_nvme_enable_controller", 00:07:05.402 "bdev_nvme_reset_controller", 00:07:05.402 "bdev_nvme_get_transport_statistics", 00:07:05.402 "bdev_nvme_apply_firmware", 00:07:05.402 "bdev_nvme_detach_controller", 00:07:05.402 "bdev_nvme_get_controllers", 00:07:05.402 "bdev_nvme_attach_controller", 00:07:05.402 "bdev_nvme_set_hotplug", 00:07:05.402 "bdev_nvme_set_options", 00:07:05.402 "bdev_null_resize", 00:07:05.402 "bdev_null_delete", 00:07:05.402 "bdev_null_create", 00:07:05.402 "bdev_malloc_delete", 00:07:05.402 "bdev_malloc_create" 00:07:05.402 ] 00:07:05.402 19:20:25 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:05.402 19:20:25 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:05.402 19:20:25 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1736294 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 1736294 ']' 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 1736294 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1736294 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:05.402 
19:20:25 spdkcli_tcp -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1736294' 00:07:05.402 killing process with pid 1736294 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 1736294 00:07:05.402 19:20:25 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 1736294 00:07:05.661 00:07:05.661 real 0m1.080s 00:07:05.661 user 0m1.797s 00:07:05.661 sys 0m0.485s 00:07:05.661 19:20:25 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:05.661 19:20:25 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:05.661 ************************************ 00:07:05.661 END TEST spdkcli_tcp 00:07:05.661 ************************************ 00:07:05.661 19:20:25 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:05.661 19:20:25 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:05.661 19:20:25 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:05.661 19:20:25 -- common/autotest_common.sh@10 -- # set +x 00:07:05.921 ************************************ 00:07:05.921 START TEST dpdk_mem_utility 00:07:05.921 ************************************ 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:05.921 * Looking for test storage... 00:07:05.921 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:05.921 19:20:25 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:05.921 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.921 --rc genhtml_branch_coverage=1 00:07:05.921 --rc genhtml_function_coverage=1 00:07:05.921 --rc genhtml_legend=1 00:07:05.921 --rc geninfo_all_blocks=1 00:07:05.921 --rc geninfo_unexecuted_blocks=1 00:07:05.921 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.921 ' 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:05.921 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.921 --rc genhtml_branch_coverage=1 00:07:05.921 --rc genhtml_function_coverage=1 00:07:05.921 --rc genhtml_legend=1 00:07:05.921 --rc geninfo_all_blocks=1 00:07:05.921 --rc geninfo_unexecuted_blocks=1 00:07:05.921 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.921 ' 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:05.921 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.921 --rc genhtml_branch_coverage=1 00:07:05.921 --rc genhtml_function_coverage=1 00:07:05.921 --rc genhtml_legend=1 00:07:05.921 --rc geninfo_all_blocks=1 00:07:05.921 --rc geninfo_unexecuted_blocks=1 00:07:05.921 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.921 ' 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:05.921 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.921 --rc genhtml_branch_coverage=1 00:07:05.921 --rc genhtml_function_coverage=1 00:07:05.921 --rc genhtml_legend=1 00:07:05.921 --rc geninfo_all_blocks=1 00:07:05.921 --rc geninfo_unexecuted_blocks=1 00:07:05.921 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.921 ' 00:07:05.921 19:20:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:05.921 19:20:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1736501 00:07:05.921 19:20:25 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1736501 00:07:05.921 19:20:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 1736501 ']' 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.921 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:05.921 19:20:25 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:05.921 [2024-11-29 19:20:25.826010] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:05.921 [2024-11-29 19:20:25.826087] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736501 ] 00:07:06.181 [2024-11-29 19:20:25.897306] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.181 [2024-11-29 19:20:25.919765] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.441 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:06.441 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:07:06.441 19:20:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:06.441 19:20:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:06.441 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:06.441 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:06.441 { 00:07:06.441 "filename": "/tmp/spdk_mem_dump.txt" 00:07:06.441 } 00:07:06.441 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:06.441 19:20:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:06.441 DPDK memory size 818.000000 MiB in 1 heap(s) 00:07:06.441 1 heaps totaling size 818.000000 MiB 00:07:06.441 size: 818.000000 MiB heap id: 0 00:07:06.441 end heaps---------- 00:07:06.441 9 mempools totaling size 603.782043 MiB 00:07:06.441 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:06.441 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:06.441 size: 100.555481 MiB name: bdev_io_1736501 00:07:06.441 size: 50.003479 MiB name: msgpool_1736501 00:07:06.441 size: 36.509338 MiB name: fsdev_io_1736501 00:07:06.441 size: 21.763794 MiB name: PDU_Pool 00:07:06.441 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:06.441 size: 4.133484 MiB name: evtpool_1736501 00:07:06.441 size: 0.026123 MiB name: Session_Pool 00:07:06.441 end mempools------- 00:07:06.441 6 memzones totaling size 4.142822 MiB 00:07:06.441 size: 1.000366 MiB name: RG_ring_0_1736501 00:07:06.441 size: 1.000366 MiB name: RG_ring_1_1736501 00:07:06.441 size: 1.000366 MiB name: RG_ring_4_1736501 
00:07:06.441 size: 1.000366 MiB name: RG_ring_5_1736501 00:07:06.441 size: 0.125366 MiB name: RG_ring_2_1736501 00:07:06.441 size: 0.015991 MiB name: RG_ring_3_1736501 00:07:06.441 end memzones------- 00:07:06.441 19:20:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:06.441 heap id: 0 total size: 818.000000 MiB number of busy elements: 44 number of free elements: 15 00:07:06.441 list of free elements. size: 10.852478 MiB 00:07:06.441 element at address: 0x200019200000 with size: 0.999878 MiB 00:07:06.441 element at address: 0x200019400000 with size: 0.999878 MiB 00:07:06.442 element at address: 0x200000400000 with size: 0.998535 MiB 00:07:06.442 element at address: 0x200032000000 with size: 0.994446 MiB 00:07:06.442 element at address: 0x200008000000 with size: 0.959839 MiB 00:07:06.442 element at address: 0x200012c00000 with size: 0.944275 MiB 00:07:06.442 element at address: 0x200019600000 with size: 0.936584 MiB 00:07:06.442 element at address: 0x200000200000 with size: 0.717346 MiB 00:07:06.442 element at address: 0x20001ae00000 with size: 0.582886 MiB 00:07:06.442 element at address: 0x200000c00000 with size: 0.495422 MiB 00:07:06.442 element at address: 0x200003e00000 with size: 0.490723 MiB 00:07:06.442 element at address: 0x200019800000 with size: 0.485657 MiB 00:07:06.442 element at address: 0x200010600000 with size: 0.481934 MiB 00:07:06.442 element at address: 0x200028200000 with size: 0.410034 MiB 00:07:06.442 element at address: 0x200000800000 with size: 0.355042 MiB 00:07:06.442 list of standard malloc elements. size: 199.218628 MiB 00:07:06.442 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:07:06.442 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:07:06.442 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:06.442 element at address: 0x2000194fff80 with size: 1.000122 MiB 00:07:06.442 element at address: 0x2000196fff80 with size: 1.000122 MiB 00:07:06.442 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:06.442 element at address: 0x2000196eff00 with size: 0.062622 MiB 00:07:06.442 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:06.442 element at address: 0x2000196efdc0 with size: 0.000305 MiB 00:07:06.442 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20000085b040 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20000085b100 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000008df880 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200000cff000 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200000cff0c0 with size: 
0.000183 MiB 00:07:06.442 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20001067b600 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200012cf1bc0 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000196efc40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000196efd00 with size: 0.000183 MiB 00:07:06.442 element at address: 0x2000198bc740 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20001ae95380 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20001ae95440 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200028268f80 with size: 0.000183 MiB 00:07:06.442 element at address: 0x200028269040 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20002826fc40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20002826fe40 with size: 0.000183 MiB 00:07:06.442 element at address: 0x20002826ff00 with size: 0.000183 MiB 00:07:06.442 list of memzone associated elements. size: 607.928894 MiB 00:07:06.442 element at address: 0x20001ae95500 with size: 211.416748 MiB 00:07:06.442 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:06.442 element at address: 0x20002826ffc0 with size: 157.562561 MiB 00:07:06.442 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:06.442 element at address: 0x200012df1e80 with size: 100.055054 MiB 00:07:06.442 associated memzone info: size: 100.054932 MiB name: MP_bdev_io_1736501_0 00:07:06.442 element at address: 0x200000dff380 with size: 48.003052 MiB 00:07:06.442 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1736501_0 00:07:06.442 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:07:06.442 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_1736501_0 00:07:06.442 element at address: 0x2000199be940 with size: 20.255554 MiB 00:07:06.442 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:06.442 element at address: 0x2000321feb40 with size: 18.005066 MiB 00:07:06.442 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:06.442 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:07:06.442 associated memzone info: size: 3.000122 MiB name: MP_evtpool_1736501_0 00:07:06.442 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:07:06.442 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1736501 00:07:06.442 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:06.442 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1736501 00:07:06.442 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:07:06.442 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:06.442 element at address: 0x2000198bc800 with size: 1.008118 MiB 00:07:06.442 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:06.442 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:07:06.442 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:06.442 element at address: 0x200003efde40 with size: 1.008118 MiB 00:07:06.442 associated memzone info: size: 1.007996 
MiB name: MP_SCSI_TASK_Pool 00:07:06.442 element at address: 0x200000cff180 with size: 1.000488 MiB 00:07:06.442 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1736501 00:07:06.442 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:07:06.442 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1736501 00:07:06.442 element at address: 0x200012cf1c80 with size: 1.000488 MiB 00:07:06.442 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1736501 00:07:06.442 element at address: 0x2000320fe940 with size: 1.000488 MiB 00:07:06.442 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1736501 00:07:06.442 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:07:06.442 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_1736501 00:07:06.442 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:07:06.442 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1736501 00:07:06.442 element at address: 0x20001067b780 with size: 0.500488 MiB 00:07:06.442 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:06.442 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:07:06.442 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:06.442 element at address: 0x20001987c540 with size: 0.250488 MiB 00:07:06.442 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:06.442 element at address: 0x2000002b7a40 with size: 0.125488 MiB 00:07:06.442 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_1736501 00:07:06.442 element at address: 0x2000008df940 with size: 0.125488 MiB 00:07:06.442 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1736501 00:07:06.442 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:07:06.443 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:06.443 element at address: 0x200028269100 with size: 0.023743 MiB 00:07:06.443 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:06.443 element at address: 0x2000008db680 with size: 0.016113 MiB 00:07:06.443 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1736501 00:07:06.443 element at address: 0x20002826f240 with size: 0.002441 MiB 00:07:06.443 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:06.443 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:07:06.443 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1736501 00:07:06.443 element at address: 0x2000008db480 with size: 0.000305 MiB 00:07:06.443 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_1736501 00:07:06.443 element at address: 0x20000085af00 with size: 0.000305 MiB 00:07:06.443 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1736501 00:07:06.443 element at address: 0x20002826fd00 with size: 0.000305 MiB 00:07:06.443 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:06.443 19:20:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:06.443 19:20:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1736501 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 1736501 ']' 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 1736501 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' 
Linux = Linux ']' 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1736501 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1736501' 00:07:06.443 killing process with pid 1736501 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 1736501 00:07:06.443 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 1736501 00:07:06.702 00:07:06.702 real 0m0.966s 00:07:06.702 user 0m0.867s 00:07:06.702 sys 0m0.453s 00:07:06.702 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:06.702 19:20:26 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:06.702 ************************************ 00:07:06.702 END TEST dpdk_mem_utility 00:07:06.702 ************************************ 00:07:06.962 19:20:26 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:06.962 19:20:26 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:06.962 19:20:26 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:06.962 19:20:26 -- common/autotest_common.sh@10 -- # set +x 00:07:06.962 ************************************ 00:07:06.962 START TEST event 00:07:06.962 ************************************ 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:06.962 * Looking for test storage... 00:07:06.962 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1693 -- # lcov --version 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:06.962 19:20:26 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:06.962 19:20:26 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:06.962 19:20:26 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:06.962 19:20:26 event -- scripts/common.sh@336 -- # IFS=.-: 00:07:06.962 19:20:26 event -- scripts/common.sh@336 -- # read -ra ver1 00:07:06.962 19:20:26 event -- scripts/common.sh@337 -- # IFS=.-: 00:07:06.962 19:20:26 event -- scripts/common.sh@337 -- # read -ra ver2 00:07:06.962 19:20:26 event -- scripts/common.sh@338 -- # local 'op=<' 00:07:06.962 19:20:26 event -- scripts/common.sh@340 -- # ver1_l=2 00:07:06.962 19:20:26 event -- scripts/common.sh@341 -- # ver2_l=1 00:07:06.962 19:20:26 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:06.962 19:20:26 event -- scripts/common.sh@344 -- # case "$op" in 00:07:06.962 19:20:26 event -- scripts/common.sh@345 -- # : 1 00:07:06.962 19:20:26 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:06.962 19:20:26 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:06.962 19:20:26 event -- scripts/common.sh@365 -- # decimal 1 00:07:06.962 19:20:26 event -- scripts/common.sh@353 -- # local d=1 00:07:06.962 19:20:26 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:06.962 19:20:26 event -- scripts/common.sh@355 -- # echo 1 00:07:06.962 19:20:26 event -- scripts/common.sh@365 -- # ver1[v]=1 00:07:06.962 19:20:26 event -- scripts/common.sh@366 -- # decimal 2 00:07:06.962 19:20:26 event -- scripts/common.sh@353 -- # local d=2 00:07:06.962 19:20:26 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:06.962 19:20:26 event -- scripts/common.sh@355 -- # echo 2 00:07:06.962 19:20:26 event -- scripts/common.sh@366 -- # ver2[v]=2 00:07:06.962 19:20:26 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:06.962 19:20:26 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:06.962 19:20:26 event -- scripts/common.sh@368 -- # return 0 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:06.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.962 --rc genhtml_branch_coverage=1 00:07:06.962 --rc genhtml_function_coverage=1 00:07:06.962 --rc genhtml_legend=1 00:07:06.962 --rc geninfo_all_blocks=1 00:07:06.962 --rc geninfo_unexecuted_blocks=1 00:07:06.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.962 ' 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:06.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.962 --rc genhtml_branch_coverage=1 00:07:06.962 --rc genhtml_function_coverage=1 00:07:06.962 --rc genhtml_legend=1 00:07:06.962 --rc geninfo_all_blocks=1 00:07:06.962 --rc geninfo_unexecuted_blocks=1 00:07:06.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.962 ' 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:06.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.962 --rc genhtml_branch_coverage=1 00:07:06.962 --rc genhtml_function_coverage=1 00:07:06.962 --rc genhtml_legend=1 00:07:06.962 --rc geninfo_all_blocks=1 00:07:06.962 --rc geninfo_unexecuted_blocks=1 00:07:06.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.962 ' 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:06.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.962 --rc genhtml_branch_coverage=1 00:07:06.962 --rc genhtml_function_coverage=1 00:07:06.962 --rc genhtml_legend=1 00:07:06.962 --rc geninfo_all_blocks=1 00:07:06.962 --rc geninfo_unexecuted_blocks=1 00:07:06.962 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.962 ' 00:07:06.962 19:20:26 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:06.962 19:20:26 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:06.962 19:20:26 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:07:06.962 19:20:26 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
00:07:06.962 19:20:26 event -- common/autotest_common.sh@10 -- # set +x 00:07:07.221 ************************************ 00:07:07.221 START TEST event_perf 00:07:07.221 ************************************ 00:07:07.221 19:20:26 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:07.221 Running I/O for 1 seconds...[2024-11-29 19:20:26.890106] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:07.221 [2024-11-29 19:20:26.890188] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736709 ] 00:07:07.221 [2024-11-29 19:20:26.962301] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:07.221 [2024-11-29 19:20:26.988134] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:07.222 [2024-11-29 19:20:26.988232] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:07.222 [2024-11-29 19:20:26.988317] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:07.222 [2024-11-29 19:20:26.988319] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.157 Running I/O for 1 seconds... 00:07:08.157 lcore 0: 192399 00:07:08.157 lcore 1: 192400 00:07:08.157 lcore 2: 192400 00:07:08.157 lcore 3: 192400 00:07:08.157 done. 00:07:08.157 00:07:08.157 real 0m1.147s 00:07:08.157 user 0m4.054s 00:07:08.157 sys 0m0.089s 00:07:08.157 19:20:28 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:08.157 19:20:28 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:08.157 ************************************ 00:07:08.157 END TEST event_perf 00:07:08.157 ************************************ 00:07:08.157 19:20:28 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:08.157 19:20:28 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:08.157 19:20:28 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:08.157 19:20:28 event -- common/autotest_common.sh@10 -- # set +x 00:07:08.417 ************************************ 00:07:08.417 START TEST event_reactor 00:07:08.417 ************************************ 00:07:08.417 19:20:28 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:08.417 [2024-11-29 19:20:28.120425] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:07:08.417 [2024-11-29 19:20:28.120507] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736995 ] 00:07:08.417 [2024-11-29 19:20:28.193588] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.417 [2024-11-29 19:20:28.214697] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.356 test_start 00:07:09.356 oneshot 00:07:09.356 tick 100 00:07:09.356 tick 100 00:07:09.356 tick 250 00:07:09.356 tick 100 00:07:09.356 tick 100 00:07:09.356 tick 100 00:07:09.356 tick 250 00:07:09.356 tick 500 00:07:09.356 tick 100 00:07:09.356 tick 100 00:07:09.356 tick 250 00:07:09.356 tick 100 00:07:09.356 tick 100 00:07:09.356 test_end 00:07:09.356 00:07:09.356 real 0m1.143s 00:07:09.356 user 0m1.058s 00:07:09.356 sys 0m0.082s 00:07:09.356 19:20:29 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:09.356 19:20:29 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:09.356 ************************************ 00:07:09.356 END TEST event_reactor 00:07:09.356 ************************************ 00:07:09.615 19:20:29 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:09.615 19:20:29 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:09.615 19:20:29 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:09.615 19:20:29 event -- common/autotest_common.sh@10 -- # set +x 00:07:09.615 ************************************ 00:07:09.615 START TEST event_reactor_perf 00:07:09.615 ************************************ 00:07:09.615 19:20:29 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:09.615 [2024-11-29 19:20:29.346259] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:07:09.615 [2024-11-29 19:20:29.346340] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1737275 ] 00:07:09.615 [2024-11-29 19:20:29.418791] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.615 [2024-11-29 19:20:29.438253] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.995 test_start 00:07:10.995 test_end 00:07:10.995 Performance: 964375 events per second 00:07:10.995 00:07:10.995 real 0m1.144s 00:07:10.995 user 0m1.064s 00:07:10.995 sys 0m0.076s 00:07:10.995 19:20:30 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:10.995 19:20:30 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:10.995 ************************************ 00:07:10.995 END TEST event_reactor_perf 00:07:10.995 ************************************ 00:07:10.995 19:20:30 event -- event/event.sh@49 -- # uname -s 00:07:10.995 19:20:30 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:10.995 19:20:30 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:10.995 19:20:30 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:10.995 19:20:30 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:10.995 19:20:30 event -- common/autotest_common.sh@10 -- # set +x 00:07:10.995 ************************************ 00:07:10.995 START TEST event_scheduler 00:07:10.995 ************************************ 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:10.995 * Looking for test storage... 
00:07:10.995 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.995 19:20:30 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.995 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.995 --rc genhtml_branch_coverage=1 00:07:10.995 --rc genhtml_function_coverage=1 00:07:10.995 --rc genhtml_legend=1 00:07:10.995 --rc geninfo_all_blocks=1 00:07:10.995 --rc geninfo_unexecuted_blocks=1 00:07:10.995 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.995 ' 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.995 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.995 --rc genhtml_branch_coverage=1 00:07:10.995 --rc genhtml_function_coverage=1 00:07:10.995 --rc genhtml_legend=1 00:07:10.995 --rc geninfo_all_blocks=1 00:07:10.995 --rc geninfo_unexecuted_blocks=1 00:07:10.995 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.995 ' 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.995 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.995 --rc genhtml_branch_coverage=1 00:07:10.995 --rc genhtml_function_coverage=1 00:07:10.995 --rc genhtml_legend=1 00:07:10.995 --rc geninfo_all_blocks=1 00:07:10.995 --rc geninfo_unexecuted_blocks=1 00:07:10.995 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.995 ' 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.995 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.995 --rc genhtml_branch_coverage=1 00:07:10.995 --rc genhtml_function_coverage=1 00:07:10.995 --rc genhtml_legend=1 00:07:10.995 --rc geninfo_all_blocks=1 00:07:10.995 --rc geninfo_unexecuted_blocks=1 00:07:10.995 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.995 ' 00:07:10.995 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:10.995 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler 
-m 0xF -p 0x2 --wait-for-rpc -f 00:07:10.995 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1737593 00:07:10.995 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:10.995 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1737593 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 1737593 ']' 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.995 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:10.995 19:20:30 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:10.995 [2024-11-29 19:20:30.766133] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:10.995 [2024-11-29 19:20:30.766181] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1737593 ] 00:07:10.995 [2024-11-29 19:20:30.832143] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:10.995 [2024-11-29 19:20:30.858895] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.995 [2024-11-29 19:20:30.858983] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.995 [2024-11-29 19:20:30.859000] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:10.995 [2024-11-29 19:20:30.859003] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:11.256 19:20:30 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:11.256 19:20:30 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:07:11.256 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:11.256 19:20:30 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.256 19:20:30 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:11.256 [2024-11-29 19:20:30.959783] dpdk_governor.c: 178:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:11.256 [2024-11-29 19:20:30.959802] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:11.256 [2024-11-29 19:20:30.959813] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:11.256 [2024-11-29 19:20:30.959821] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:11.256 [2024-11-29 19:20:30.959828] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:11.256 19:20:30 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:30 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:11.257 19:20:30 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:30 
event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 [2024-11-29 19:20:31.028841] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:11.257 19:20:31 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:11.257 19:20:31 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:11.257 19:20:31 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 ************************************ 00:07:11.257 START TEST scheduler_create_thread 00:07:11.257 ************************************ 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 2 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 3 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 4 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 5 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 6 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 7 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 8 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 9 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 10 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.257 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.257 19:20:31 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.516 19:20:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:11.516 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.517 19:20:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:12.895 19:20:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:12.895 19:20:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:12.895 19:20:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:12.895 19:20:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:12.895 19:20:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.833 19:20:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:13.833 00:07:13.833 real 0m2.616s 00:07:13.833 user 0m0.024s 00:07:13.833 sys 0m0.007s 00:07:13.833 19:20:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:13.833 19:20:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.833 ************************************ 00:07:13.833 END TEST scheduler_create_thread 00:07:13.833 ************************************ 00:07:13.833 19:20:33 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:13.833 19:20:33 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1737593 00:07:13.833 19:20:33 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 1737593 ']' 00:07:13.834 19:20:33 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 1737593 00:07:13.834 19:20:33 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:13.834 19:20:33 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:13.834 19:20:33 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1737593 00:07:14.095 19:20:33 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:14.095 19:20:33 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:14.095 19:20:33 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1737593' 00:07:14.095 killing process with pid 1737593 00:07:14.095 19:20:33 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 1737593 00:07:14.095 19:20:33 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 1737593 00:07:14.354 [2024-11-29 19:20:34.158877] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
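
The scheduler section above is driven entirely over JSON-RPC: the test app starts with --wait-for-rpc, the harness switches the framework to the dynamic scheduler before framework_start_init, and the scheduler_plugin extension to rpc.py then creates pinned busy/idle threads, rebalances one with scheduler_thread_set_active, and deletes another. A minimal sketch of that sequence, assuming a local SPDK checkout at $SPDK_DIR (method names, flags, and the socket path are copied from the trace; in the real harness the plugin is found via PYTHONPATH):

rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk.sock"
$rpc framework_set_scheduler dynamic                           # must run while --wait-for-rpc holds init
$rpc framework_start_init
plug="$rpc --plugin scheduler_plugin"                          # plugin adds the scheduler_thread_* methods
$plug scheduler_thread_create -n active_pinned -m 0x1 -a 100   # fully busy, pinned to core 0
$plug scheduler_thread_create -n idle_pinned -m 0x1 -a 0       # idle, pinned to core 0
tid=$($plug scheduler_thread_create -n half_active -a 0)       # create prints the new thread id
$plug scheduler_thread_set_active "$tid" 50                    # raise it to 50% active
$plug scheduler_thread_delete "$tid"
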
00:07:14.614 00:07:14.614 real 0m3.752s 00:07:14.614 user 0m5.742s 00:07:14.614 sys 0m0.412s 00:07:14.614 19:20:34 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:14.614 19:20:34 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:14.614 ************************************ 00:07:14.614 END TEST event_scheduler 00:07:14.614 ************************************ 00:07:14.614 19:20:34 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:14.614 19:20:34 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:14.614 19:20:34 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:14.614 19:20:34 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:14.614 19:20:34 event -- common/autotest_common.sh@10 -- # set +x 00:07:14.614 ************************************ 00:07:14.614 START TEST app_repeat 00:07:14.614 ************************************ 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1738189 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1738189' 00:07:14.615 Process app_repeat pid: 1738189 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:14.615 spdk_app_start Round 0 00:07:14.615 19:20:34 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1738189 /var/tmp/spdk-nbd.sock 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1738189 ']' 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:14.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:14.615 19:20:34 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:14.615 [2024-11-29 19:20:34.432497] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
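
app_repeat is launched once against its own RPC socket and asked for four iterations (-t 4); the harness then loops over rounds 0..2, re-attaching to the socket after each in-app restart. The launch traced above, sketched with $SPDK_DIR standing in for the jenkins workspace path:

modprobe nbd                                          # the test needs the kernel nbd driver
$SPDK_DIR/test/event/app_repeat/app_repeat \
    -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 &           # two-core mask, four repeat rounds
repeat_pid=$!
trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
for i in {0..2}; do
    echo "spdk_app_start Round $i"
    waitforlisten $repeat_pid /var/tmp/spdk-nbd.sock  # autotest_common.sh helper
    # ... per-round malloc/nbd verification, see the sketches below ...
done
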
00:07:14.615 [2024-11-29 19:20:34.432590] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1738189 ] 00:07:14.615 [2024-11-29 19:20:34.506621] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:14.874 [2024-11-29 19:20:34.531135] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.874 [2024-11-29 19:20:34.531138] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.874 19:20:34 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:14.874 19:20:34 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:14.874 19:20:34 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:15.134 Malloc0 00:07:15.134 19:20:34 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:15.134 Malloc1 00:07:15.398 19:20:35 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:15.398 /dev/nbd0 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:15.398 19:20:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 
/proc/partitions 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.399 1+0 records in 00:07:15.399 1+0 records out 00:07:15.399 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256651 s, 16.0 MB/s 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.399 19:20:35 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:15.661 /dev/nbd1 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.661 1+0 records in 00:07:15.661 1+0 records out 00:07:15.661 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000174998 s, 23.4 MB/s 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.661 19:20:35 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
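
waitfornbd, whose expansion fills the trace above, has two phases: poll /proc/partitions until the nbd name appears, then prove the device actually serves I/O with one direct 4 KiB read. A simplified sketch of that logic (the real helper lives in autotest_common.sh and writes into the workspace; the retry delay is an assumption, as the trace does not show one):

waitfornbd() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1                                  # assumed back-off
    done
    for ((i = 1; i <= 20; i++)); do
        if dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct; then
            local size=$(stat -c %s /tmp/nbdtest)  # a non-empty read means the device is live
            rm -f /tmp/nbdtest
            [ "$size" != 0 ] && return 0
        fi
    done
    return 1
}
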
00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.661 19:20:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:15.921 { 00:07:15.921 "nbd_device": "/dev/nbd0", 00:07:15.921 "bdev_name": "Malloc0" 00:07:15.921 }, 00:07:15.921 { 00:07:15.921 "nbd_device": "/dev/nbd1", 00:07:15.921 "bdev_name": "Malloc1" 00:07:15.921 } 00:07:15.921 ]' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:15.921 { 00:07:15.921 "nbd_device": "/dev/nbd0", 00:07:15.921 "bdev_name": "Malloc0" 00:07:15.921 }, 00:07:15.921 { 00:07:15.921 "nbd_device": "/dev/nbd1", 00:07:15.921 "bdev_name": "Malloc1" 00:07:15.921 } 00:07:15.921 ]' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:15.921 /dev/nbd1' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:15.921 /dev/nbd1' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:15.921 256+0 records in 00:07:15.921 256+0 records out 00:07:15.921 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115772 s, 90.6 MB/s 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:15.921 256+0 records in 00:07:15.921 256+0 records out 00:07:15.921 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195024 s, 53.8 MB/s 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:15.921 19:20:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:16.181 256+0 records in 00:07:16.181 256+0 records out 00:07:16.181 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210958 s, 49.7 
MB/s 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:16.181 19:20:35 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:16.181 19:20:36 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:16.440 19:20:36 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.441 19:20:36 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.441 19:20:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:16.699 19:20:36 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:16.699 19:20:36 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:16.957 19:20:36 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:17.216 [2024-11-29 19:20:36.893475] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:17.216 [2024-11-29 19:20:36.913607] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.216 [2024-11-29 19:20:36.913611] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.216 [2024-11-29 19:20:36.952149] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:17.216 [2024-11-29 19:20:36.952193] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:20.503 19:20:39 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:20.503 19:20:39 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:20.503 spdk_app_start Round 1 00:07:20.503 19:20:39 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1738189 /var/tmp/spdk-nbd.sock 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1738189 ']' 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:20.503 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
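
Between device setup and teardown, each round runs the same 1 MiB write/verify pass (nbd_dd_data_verify in the trace): fill a scratch file from /dev/urandom, dd it onto every nbd device with O_DIRECT, then byte-compare each device against the file. The Round 0 pass above, reduced to its commands ($tmp stands for the nbdrandtest scratch file in the workspace):

tmp=nbdrandtest
dd if=/dev/urandom of=$tmp bs=4096 count=256              # 256 x 4 KiB = 1 MiB of random data
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if=$tmp of=$nbd bs=4096 count=256 oflag=direct     # write it out through O_DIRECT
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M $tmp $nbd                                # read back and compare the first 1 MiB
done
rm $tmp
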
00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:20.503 19:20:39 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:20.503 19:20:39 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:20.503 Malloc0 00:07:20.503 19:20:40 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:20.503 Malloc1 00:07:20.503 19:20:40 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:20.503 19:20:40 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:20.762 /dev/nbd0 00:07:20.762 19:20:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:20.762 19:20:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:20.762 1+0 records in 00:07:20.762 1+0 records out 00:07:20.762 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232772 s, 17.6 MB/s 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:20.762 19:20:40 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:20.762 19:20:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:20.762 19:20:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:20.762 19:20:40 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:21.022 /dev/nbd1 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:21.022 1+0 records in 00:07:21.022 1+0 records out 00:07:21.022 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268232 s, 15.3 MB/s 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:21.022 19:20:40 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:21.022 19:20:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:21.282 { 00:07:21.282 "nbd_device": "/dev/nbd0", 00:07:21.282 "bdev_name": "Malloc0" 00:07:21.282 }, 00:07:21.282 { 00:07:21.282 "nbd_device": "/dev/nbd1", 00:07:21.282 "bdev_name": "Malloc1" 00:07:21.282 } 00:07:21.282 ]' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:21.282 { 00:07:21.282 "nbd_device": "/dev/nbd0", 00:07:21.282 "bdev_name": "Malloc0" 00:07:21.282 }, 00:07:21.282 { 00:07:21.282 "nbd_device": "/dev/nbd1", 00:07:21.282 "bdev_name": "Malloc1" 00:07:21.282 } 00:07:21.282 ]' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:21.282 /dev/nbd1' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:21.282 /dev/nbd1' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:21.282 256+0 records in 00:07:21.282 256+0 records out 00:07:21.282 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115739 s, 90.6 MB/s 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:21.282 19:20:41 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:21.282 256+0 records in 00:07:21.283 256+0 records out 00:07:21.283 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198145 s, 52.9 MB/s 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:21.283 256+0 records in 00:07:21.283 256+0 records out 00:07:21.283 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213573 s, 49.1 MB/s 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:21.283 19:20:41 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:21.542 19:20:41 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:21.800 19:20:41 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:21.801 19:20:41 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:21.801 19:20:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:22.060 19:20:41 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:22.060 19:20:41 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:22.319 19:20:42 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:22.319 [2024-11-29 19:20:42.187416] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:22.319 [2024-11-29 19:20:42.207008] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.319 [2024-11-29 19:20:42.207011] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.578 [2024-11-29 19:20:42.247168] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:22.578 [2024-11-29 19:20:42.247212] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:25.868 19:20:45 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:25.868 19:20:45 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:25.868 spdk_app_start Round 2 00:07:25.868 19:20:45 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1738189 /var/tmp/spdk-nbd.sock 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1738189 ']' 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:25.868 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
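
The teardown that closes each round, as traced above: stop every nbd disk over RPC, wait for the kernel to drop the device from /proc/partitions, assert that nbd_get_disks reports nothing left, then ask the app to restart itself. A sketch (the polling loop condenses waitfornbd_exit; `|| true` absorbs grep's nonzero exit on an empty list, which the trace shows as `# true`):

rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
for nbd in /dev/nbd0 /dev/nbd1; do
    $rpc nbd_stop_disk $nbd
    while grep -q -w "$(basename $nbd)" /proc/partitions; do
        sleep 0.1                                 # assumed delay, as in waitfornbd
    done
done
count=$($rpc nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
[ "$count" -eq 0 ]                                # nothing may stay registered between rounds
$rpc spdk_kill_instance SIGTERM                   # app_repeat reinitializes for the next round
sleep 3
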
00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:25.868 19:20:45 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:25.868 19:20:45 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:25.868 Malloc0 00:07:25.868 19:20:45 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:25.868 Malloc1 00:07:25.868 19:20:45 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:25.868 19:20:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:26.128 /dev/nbd0 00:07:26.128 19:20:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:26.128 19:20:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:26.128 1+0 records in 00:07:26.128 1+0 records out 00:07:26.128 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027405 s, 14.9 MB/s 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:26.128 19:20:45 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:26.128 19:20:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:26.128 19:20:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:26.128 19:20:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:26.386 /dev/nbd1 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:26.386 1+0 records in 00:07:26.386 1+0 records out 00:07:26.386 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244874 s, 16.7 MB/s 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:26.386 19:20:46 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:26.386 19:20:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:26.645 { 00:07:26.645 "nbd_device": "/dev/nbd0", 00:07:26.645 "bdev_name": "Malloc0" 00:07:26.645 }, 00:07:26.645 { 00:07:26.645 "nbd_device": "/dev/nbd1", 00:07:26.645 "bdev_name": "Malloc1" 00:07:26.645 } 00:07:26.645 ]' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:26.645 { 00:07:26.645 "nbd_device": "/dev/nbd0", 00:07:26.645 "bdev_name": "Malloc0" 00:07:26.645 }, 00:07:26.645 { 00:07:26.645 "nbd_device": "/dev/nbd1", 00:07:26.645 "bdev_name": "Malloc1" 00:07:26.645 } 00:07:26.645 ]' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:26.645 /dev/nbd1' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:26.645 /dev/nbd1' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:26.645 256+0 records in 00:07:26.645 256+0 records out 00:07:26.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115504 s, 90.8 MB/s 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:26.645 256+0 records in 00:07:26.645 256+0 records out 00:07:26.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019556 s, 53.6 MB/s 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:26.645 256+0 records in 00:07:26.645 256+0 records out 00:07:26.645 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210701 s, 49.8 MB/s 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.645 19:20:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.904 19:20:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:27.163 19:20:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:27.164 19:20:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:27.164 19:20:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:27.164 19:20:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:27.423 19:20:47 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:27.423 19:20:47 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:27.423 19:20:47 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:27.683 [2024-11-29 19:20:47.463583] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:27.683 [2024-11-29 19:20:47.483405] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.683 [2024-11-29 19:20:47.483407] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.683 [2024-11-29 19:20:47.522841] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:27.683 [2024-11-29 19:20:47.522888] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:31.118 19:20:50 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1738189 /var/tmp/spdk-nbd.sock 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1738189 ']' 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:31.118 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
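The nbd_dd_data_verify calls traced above follow a plain write-then-verify pattern: fill a scratch file with 1 MiB of random data, dd it onto each NBD device with O_DIRECT, then cmp each device back against the file and remove it. A condensed sketch reconstructed from the traced commands in bdev/nbd_common.sh (the scratch path is shortened here; the trace uses a file under the Jenkins workspace):

    nbd_dd_data_verify() {
        local nbd_list=($1)          # word-split, e.g. "/dev/nbd0 /dev/nbd1"
        local operation=$2           # write | verify
        local tmp_file=/tmp/nbdrandtest

        if [ "$operation" = write ]; then
            # 256 x 4 KiB = 1 MiB of random reference data
            dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
            for i in "${nbd_list[@]}"; do
                dd if="$tmp_file" of="$i" bs=4096 count=256 oflag=direct
            done
        elif [ "$operation" = verify ]; then
            for i in "${nbd_list[@]}"; do
                cmp -b -n 1M "$tmp_file" "$i"   # -b reports any differing bytes
            done
            rm "$tmp_file"
        fi
    }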
00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:31.118 19:20:50 event.app_repeat -- event/event.sh@39 -- # killprocess 1738189 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 1738189 ']' 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 1738189 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1738189 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1738189' 00:07:31.118 killing process with pid 1738189 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@973 -- # kill 1738189 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@978 -- # wait 1738189 00:07:31.118 spdk_app_start is called in Round 0. 00:07:31.118 Shutdown signal received, stop current app iteration 00:07:31.118 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 reinitialization... 00:07:31.118 spdk_app_start is called in Round 1. 00:07:31.118 Shutdown signal received, stop current app iteration 00:07:31.118 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 reinitialization... 00:07:31.118 spdk_app_start is called in Round 2. 00:07:31.118 Shutdown signal received, stop current app iteration 00:07:31.118 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 reinitialization... 00:07:31.118 spdk_app_start is called in Round 3. 
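killprocess, whose xtrace output appears above, guards against two failure modes before sending the signal: a stale pid (kill -0) and accidentally signalling a sudo wrapper instead of the reactor process, which the trace shows reporting itself as reactor_0. A condensed sketch of the traced logic from autotest_common.sh (the sudo branch is omitted here):

    killprocess() {
        local pid=$1
        [ -n "$pid" ] || return 1
        kill -0 "$pid" || return 1                      # bail out if already gone
        local process_name=
        if [ "$(uname)" = Linux ]; then
            process_name=$(ps --no-headers -o comm= "$pid")
        fi
        # the real helper special-cases process_name = sudo; not shown here
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                     # reap it so the harness sees the exit code
    }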
00:07:31.118 Shutdown signal received, stop current app iteration 00:07:31.118 19:20:50 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:31.118 19:20:50 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:31.118 00:07:31.118 real 0m16.303s 00:07:31.118 user 0m35.137s 00:07:31.118 sys 0m3.236s 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:31.118 19:20:50 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:31.118 ************************************ 00:07:31.118 END TEST app_repeat 00:07:31.118 ************************************ 00:07:31.118 19:20:50 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:31.118 19:20:50 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:31.118 19:20:50 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.118 19:20:50 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.118 19:20:50 event -- common/autotest_common.sh@10 -- # set +x 00:07:31.118 ************************************ 00:07:31.118 START TEST cpu_locks 00:07:31.118 ************************************ 00:07:31.118 19:20:50 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:31.118 * Looking for test storage... 00:07:31.118 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:31.118 19:20:50 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:31.118 19:20:50 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version 00:07:31.118 19:20:50 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:31.118 19:20:50 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:31.118 19:20:50 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:31.119 19:20:50 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:31.119 19:20:50 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:31.119 19:20:50 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:31.119 19:20:50 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:31.119 19:20:50 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:31.119 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.119 --rc genhtml_branch_coverage=1 00:07:31.119 --rc genhtml_function_coverage=1 00:07:31.119 --rc genhtml_legend=1 00:07:31.119 --rc geninfo_all_blocks=1 00:07:31.119 --rc geninfo_unexecuted_blocks=1 00:07:31.119 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.119 ' 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:31.119 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.119 --rc genhtml_branch_coverage=1 00:07:31.119 --rc genhtml_function_coverage=1 00:07:31.119 --rc genhtml_legend=1 00:07:31.119 --rc geninfo_all_blocks=1 00:07:31.119 --rc geninfo_unexecuted_blocks=1 00:07:31.119 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.119 ' 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:31.119 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.119 --rc genhtml_branch_coverage=1 00:07:31.119 --rc genhtml_function_coverage=1 00:07:31.119 --rc genhtml_legend=1 00:07:31.119 --rc geninfo_all_blocks=1 00:07:31.119 --rc geninfo_unexecuted_blocks=1 00:07:31.119 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.119 ' 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:31.119 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.119 --rc genhtml_branch_coverage=1 00:07:31.119 --rc genhtml_function_coverage=1 00:07:31.119 --rc genhtml_legend=1 00:07:31.119 --rc geninfo_all_blocks=1 00:07:31.119 --rc geninfo_unexecuted_blocks=1 00:07:31.119 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.119 ' 00:07:31.119 19:20:50 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:31.119 19:20:50 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:31.119 19:20:50 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:31.119 19:20:50 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.119 19:20:50 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.119 ************************************ 00:07:31.119 START TEST default_locks 00:07:31.119 ************************************ 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1741351 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 1741351 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 1741351 ']' 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.119 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.119 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.378 [2024-11-29 19:20:51.032490] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
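Earlier in the cpu_locks prologue, the lcov version probe ('lt 1.15 2') walked through cmp_versions in scripts/common.sh: both version strings are split on '.', '-' and ':', iterated out to the longer component count, and compared component by component. A condensed sketch of that traced logic (the real helper also routes each component through a decimal() sanitizer, dropped here):

    cmp_versions() {
        local ver1 ver2 ver1_l ver2_l op=$2 lt=0 gt=0 v
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$3"
        ver1_l=${#ver1[@]} ver2_l=${#ver2[@]}

        for ((v = 0; v < (ver1_l > ver2_l ? ver1_l : ver2_l); v++)); do
            # missing components compare as 0, so "1.15" vs "2" acts like 1.15 vs 2.0
            ((${ver1[v]:-0} > ${ver2[v]:-0})) && { gt=1; break; }
            ((${ver1[v]:-0} < ${ver2[v]:-0})) && { lt=1; break; }
        done
        case "$op" in
            '<') ((lt == 1)) ;;
            '>') ((gt == 1)) ;;
        esac
    }

Here cmp_versions 1.15 '<' 2 succeeds on the first component (1 < 2), which is why the run exports the --rc lcov_branch_coverage options seen in the LCOV_OPTS block above.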
00:07:31.379 [2024-11-29 19:20:51.032547] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1741351 ] 00:07:31.379 [2024-11-29 19:20:51.101111] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.379 [2024-11-29 19:20:51.122847] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.637 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:31.637 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:07:31.638 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 1741351 00:07:31.638 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 1741351 00:07:31.638 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:31.897 lslocks: write error 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 1741351 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 1741351 ']' 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 1741351 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1741351 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:31.897 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1741351' 00:07:31.897 killing process with pid 1741351 00:07:32.156 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 1741351 00:07:32.156 19:20:51 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 1741351 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1741351 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1741351 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 1741351 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 1741351 ']' 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local 
max_retries=100 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.416 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:32.416 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1741351) - No such process 00:07:32.416 ERROR: process (pid: 1741351) is no longer running 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:32.416 00:07:32.416 real 0m1.084s 00:07:32.416 user 0m1.062s 00:07:32.416 sys 0m0.573s 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:32.416 19:20:52 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:32.416 ************************************ 00:07:32.416 END TEST default_locks 00:07:32.416 ************************************ 00:07:32.416 19:20:52 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:32.416 19:20:52 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:32.416 19:20:52 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:32.416 19:20:52 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:32.416 ************************************ 00:07:32.416 START TEST default_locks_via_rpc 00:07:32.416 ************************************ 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1741537 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 1741537 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1741537 ']' 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk.sock...' 00:07:32.416 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.416 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:32.416 [2024-11-29 19:20:52.192164] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:32.416 [2024-11-29 19:20:52.192244] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1741537 ] 00:07:32.416 [2024-11-29 19:20:52.263594] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.416 [2024-11-29 19:20:52.286666] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 1741537 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 1741537 00:07:32.676 19:20:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 1741537 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 1741537 ']' 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 1741537 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- 
# '[' Linux = Linux ']' 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1741537 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1741537' 00:07:33.243 killing process with pid 1741537 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 1741537 00:07:33.243 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 1741537 00:07:33.502 00:07:33.502 real 0m1.191s 00:07:33.502 user 0m1.162s 00:07:33.502 sys 0m0.580s 00:07:33.502 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:33.502 19:20:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:33.502 ************************************ 00:07:33.502 END TEST default_locks_via_rpc 00:07:33.502 ************************************ 00:07:33.502 19:20:53 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:33.502 19:20:53 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:33.502 19:20:53 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:33.502 19:20:53 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:33.761 ************************************ 00:07:33.761 START TEST non_locking_app_on_locked_coremask 00:07:33.761 ************************************ 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1741699 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 1741699 /var/tmp/spdk.sock 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1741699 ']' 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:33.761 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:33.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:33.762 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:33.762 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:33.762 [2024-11-29 19:20:53.460577] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
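locks_exist, traced repeatedly in these tests, simply asks lslocks whether the target pid holds any lock whose entry mentions spdk_cpu_lock; the per-core lock files live under /var/tmp/spdk_cpu_lock_NNN. The recurring "lslocks: write error" line is harmless: grep -q exits on its first match and lslocks takes an EPIPE writing the rest of its table. The traced helper:

    locks_exist() {
        lslocks -p "$1" | grep -q spdk_cpu_lock
    }

default_locks_via_rpc runs the same check after toggling the locks at runtime with the framework_disable_cpumask_locks and framework_enable_cpumask_locks RPCs shown above.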
00:07:33.762 [2024-11-29 19:20:53.460646] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1741699 ] 00:07:33.762 [2024-11-29 19:20:53.531458] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.762 [2024-11-29 19:20:53.554665] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1741847 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 1741847 /var/tmp/spdk2.sock 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1741847 ']' 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:34.020 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:34.020 19:20:53 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:34.020 [2024-11-29 19:20:53.781123] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:34.020 [2024-11-29 19:20:53.781189] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1741847 ] 00:07:34.020 [2024-11-29 19:20:53.879626] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
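non_locking_app_on_locked_coremask launches a second target on the same core as the first, which only works because the second opts out of lock claiming. The two launch lines from the trace, reduced to their essentials:

    # first instance claims core 0 and holds its spdk_cpu_lock file
    build/bin/spdk_tgt -m 0x1 &
    # second instance shares core 0 by skipping the claim ("CPU core locks deactivated");
    # -r gives it its own RPC socket so the two targets do not collide
    build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &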
00:07:34.020 [2024-11-29 19:20:53.879656] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.020 [2024-11-29 19:20:53.921886] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.589 19:20:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.589 19:20:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:34.589 19:20:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 1741699 00:07:34.589 19:20:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1741699 00:07:34.589 19:20:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:35.527 lslocks: write error 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 1741699 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1741699 ']' 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1741699 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1741699 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1741699' 00:07:35.527 killing process with pid 1741699 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1741699 00:07:35.527 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1741699 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 1741847 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1741847 ']' 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1741847 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1741847 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:36.095 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1741847' 00:07:36.096 
killing process with pid 1741847 00:07:36.096 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1741847 00:07:36.096 19:20:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1741847 00:07:36.355 00:07:36.355 real 0m2.681s 00:07:36.355 user 0m2.715s 00:07:36.355 sys 0m1.115s 00:07:36.355 19:20:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:36.355 19:20:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:36.355 ************************************ 00:07:36.355 END TEST non_locking_app_on_locked_coremask 00:07:36.355 ************************************ 00:07:36.356 19:20:56 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:36.356 19:20:56 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:36.356 19:20:56 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:36.356 19:20:56 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:36.356 ************************************ 00:07:36.356 START TEST locking_app_on_unlocked_coremask 00:07:36.356 ************************************ 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1742254 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 1742254 /var/tmp/spdk.sock 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1742254 ']' 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:36.356 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:36.356 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:36.356 [2024-11-29 19:20:56.222025] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:36.356 [2024-11-29 19:20:56.222080] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1742254 ] 00:07:36.615 [2024-11-29 19:20:56.292363] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
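locking_app_on_unlocked_coremask inverts the previous case: the first target starts with --disable-cpumask-locks, so core 0 is left unclaimed and a second, normally locked target can take it. The claiming itself happens inside app.c; a rough flock(1) analogy shows the idea, but this is illustrative only and not SPDK's actual code path:

    # hypothetical stand-in for app.c claiming core 0
    exec 9> /var/tmp/spdk_cpu_lock_000
    if ! flock -n 9; then
        echo "Cannot create lock on core 0, another process has claimed it" >&2
        exit 1
    fi
    # the lock lives as long as fd 9, i.e. for the lifetime of this process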
00:07:36.615 [2024-11-29 19:20:56.292390] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.615 [2024-11-29 19:20:56.315604] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1742260 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 1742260 /var/tmp/spdk2.sock 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1742260 ']' 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:36.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:36.615 19:20:56 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:36.875 [2024-11-29 19:20:56.536441] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:07:36.875 [2024-11-29 19:20:56.536507] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1742260 ] 00:07:36.875 [2024-11-29 19:20:56.632980] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.875 [2024-11-29 19:20:56.679066] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.134 19:20:57 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:37.134 19:20:57 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:37.134 19:20:57 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 1742260 00:07:37.393 19:20:57 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1742260 00:07:37.393 19:20:57 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:38.773 lslocks: write error 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 1742254 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1742254 ']' 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 1742254 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1742254 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1742254' 00:07:38.773 killing process with pid 1742254 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 1742254 00:07:38.773 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 1742254 00:07:39.032 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 1742260 00:07:39.032 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1742260 ']' 00:07:39.032 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 1742260 00:07:39.032 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:39.032 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:39.032 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1742260 00:07:39.292 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:39.292 19:20:58 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:39.292 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1742260' 00:07:39.292 killing process with pid 1742260 00:07:39.292 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 1742260 00:07:39.292 19:20:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 1742260 00:07:39.551 00:07:39.551 real 0m3.030s 00:07:39.551 user 0m3.088s 00:07:39.551 sys 0m1.318s 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.551 ************************************ 00:07:39.551 END TEST locking_app_on_unlocked_coremask 00:07:39.551 ************************************ 00:07:39.551 19:20:59 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:39.551 19:20:59 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:39.551 19:20:59 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.551 19:20:59 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:39.551 ************************************ 00:07:39.551 START TEST locking_app_on_locked_coremask 00:07:39.551 ************************************ 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1742825 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 1742825 /var/tmp/spdk.sock 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1742825 ']' 00:07:39.551 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.552 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.552 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.552 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.552 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.552 [2024-11-29 19:20:59.329694] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
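Every test above blocks on waitforlisten before issuing RPCs. Only the helper's prologue shows in the log because its polling loop runs with xtrace disabled (the xtrace_disable / set +x pair). A plausible reconstruction under that caveat; using rpc_get_methods as the liveness probe is an assumption, any cheap RPC would do:

    waitforlisten() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100 i
        [ -n "$pid" ] || return 1
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        for ((i = 0; i < max_retries; i++)); do
            kill -0 "$pid" 2> /dev/null || return 1        # target died during startup
            # scripts/rpc.py is relative to the repo root here
            if scripts/rpc.py -s "$rpc_addr" rpc_get_methods &> /dev/null; then
                return 0                                   # socket is up and answering
            fi
            sleep 0.5
        done
        return 1
    }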
00:07:39.552 [2024-11-29 19:20:59.329749] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1742825 ] 00:07:39.552 [2024-11-29 19:20:59.398935] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.552 [2024-11-29 19:20:59.421859] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1742831 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1742831 /var/tmp/spdk2.sock 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1742831 /var/tmp/spdk2.sock 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 1742831 /var/tmp/spdk2.sock 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1742831 ']' 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:39.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.811 19:20:59 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.811 [2024-11-29 19:20:59.639689] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:07:39.812 [2024-11-29 19:20:59.639778] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1742831 ] 00:07:40.071 [2024-11-29 19:20:59.735530] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1742825 has claimed it. 00:07:40.071 [2024-11-29 19:20:59.735561] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:40.641 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1742831) - No such process 00:07:40.641 ERROR: process (pid: 1742831) is no longer running 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 1742825 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1742825 00:07:40.641 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:40.900 lslocks: write error 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 1742825 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1742825 ']' 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1742825 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1742825 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1742825' 00:07:40.900 killing process with pid 1742825 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1742825 00:07:40.900 19:21:00 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1742825 00:07:41.469 00:07:41.469 real 0m1.770s 00:07:41.469 user 0m1.884s 00:07:41.469 sys 0m0.663s 00:07:41.469 19:21:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 
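The failing second instance above is driven through NOT, a wrapper that inverts the wrapped command's exit status while still treating signal deaths (exit codes above 128) as real failures. A condensed sketch of the traced logic from autotest_common.sh:

    NOT() {
        local es=0
        "$@" || es=$?
        (( es > 128 )) && return 1   # killed by a signal: not an acceptable "failure"
        (( es != 0 ))                # NOT succeeds only if the command failed
    }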
00:07:41.469 19:21:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:41.469 ************************************ 00:07:41.469 END TEST locking_app_on_locked_coremask 00:07:41.469 ************************************ 00:07:41.469 19:21:01 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:41.469 19:21:01 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:41.469 19:21:01 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:41.469 19:21:01 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:41.469 ************************************ 00:07:41.469 START TEST locking_overlapped_coremask 00:07:41.469 ************************************ 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1743215 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 1743215 /var/tmp/spdk.sock 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 1743215 ']' 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.469 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.469 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:41.469 [2024-11-29 19:21:01.180661] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:07:41.469 [2024-11-29 19:21:01.180747] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1743215 ] 00:07:41.469 [2024-11-29 19:21:01.253745] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:41.469 [2024-11-29 19:21:01.279233] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:41.469 [2024-11-29 19:21:01.279342] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:41.469 [2024-11-29 19:21:01.279345] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1743362 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1743362 /var/tmp/spdk2.sock 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1743362 /var/tmp/spdk2.sock 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 1743362 /var/tmp/spdk2.sock 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 1743362 ']' 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:41.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.728 19:21:01 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:41.728 [2024-11-29 19:21:01.503694] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
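locking_overlapped_coremask pins the first target to -m 0x7 (cores 0-2) and then asks NOT to start a second one with -m 0x1c (cores 2-4). The two masks intersect on exactly one core, which is why the next trace line reports "Cannot create lock on core 2":

    printf 'contested mask: 0x%x\n' $((0x7 & 0x1c))   # 0b00111 & 0b11100 = 0b00100 -> core 2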
00:07:41.728 [2024-11-29 19:21:01.503784] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1743362 ] 00:07:41.728 [2024-11-29 19:21:01.605820] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1743215 has claimed it. 00:07:41.728 [2024-11-29 19:21:01.605865] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:42.297 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1743362) - No such process 00:07:42.297 ERROR: process (pid: 1743362) is no longer running 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 1743215 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 1743215 ']' 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 1743215 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:42.297 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1743215 00:07:42.556 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:42.556 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:42.556 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1743215' 00:07:42.556 killing process with pid 1743215 00:07:42.556 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 1743215 00:07:42.556 19:21:02 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 1743215 00:07:42.816 00:07:42.816 real 0m1.381s 00:07:42.816 user 0m3.863s 00:07:42.816 sys 0m0.443s 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:42.816 ************************************ 00:07:42.816 END TEST locking_overlapped_coremask 00:07:42.816 ************************************ 00:07:42.816 19:21:02 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:42.816 19:21:02 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:42.816 19:21:02 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:42.816 19:21:02 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:42.816 ************************************ 00:07:42.816 START TEST locking_overlapped_coremask_via_rpc 00:07:42.816 ************************************ 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1743552 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 1743552 /var/tmp/spdk.sock 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1743552 ']' 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:42.816 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:42.816 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:42.816 [2024-11-29 19:21:02.636456] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:42.816 [2024-11-29 19:21:02.636510] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1743552 ] 00:07:42.816 [2024-11-29 19:21:02.705960] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:42.816 [2024-11-29 19:21:02.705986] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:43.076 [2024-11-29 19:21:02.732229] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:43.076 [2024-11-29 19:21:02.732322] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:43.076 [2024-11-29 19:21:02.732324] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1743627 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 1743627 /var/tmp/spdk2.sock 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1743627 ']' 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:43.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:43.076 19:21:02 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:43.076 [2024-11-29 19:21:02.950805] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:43.076 [2024-11-29 19:21:02.950894] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1743627 ] 00:07:43.336 [2024-11-29 19:21:03.051298] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:43.336 [2024-11-29 19:21:03.051330] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:43.336 [2024-11-29 19:21:03.100810] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:43.336 [2024-11-29 19:21:03.104650] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:43.336 [2024-11-29 19:21:03.104651] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:43.905 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:43.905 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.165 [2024-11-29 19:21:03.836665] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1743552 has claimed it. 
00:07:44.165 request: 00:07:44.165 { 00:07:44.165 "method": "framework_enable_cpumask_locks", 00:07:44.165 "req_id": 1 00:07:44.165 } 00:07:44.165 Got JSON-RPC error response 00:07:44.165 response: 00:07:44.165 { 00:07:44.165 "code": -32603, 00:07:44.165 "message": "Failed to claim CPU core: 2" 00:07:44.165 } 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 1743552 /var/tmp/spdk.sock 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1743552 ']' 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:44.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:44.165 19:21:03 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 1743627 /var/tmp/spdk2.sock 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1743627 ']' 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:44.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:44.165 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:44.425 00:07:44.425 real 0m1.652s 00:07:44.425 user 0m0.787s 00:07:44.425 sys 0m0.175s 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:44.425 19:21:04 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.425 ************************************ 00:07:44.425 END TEST locking_overlapped_coremask_via_rpc 00:07:44.425 ************************************ 00:07:44.425 19:21:04 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:44.425 19:21:04 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1743552 ]] 00:07:44.425 19:21:04 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1743552 00:07:44.425 19:21:04 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1743552 ']' 00:07:44.425 19:21:04 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1743552 00:07:44.425 19:21:04 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:44.425 19:21:04 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:44.425 19:21:04 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1743552 00:07:44.685 19:21:04 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:44.685 19:21:04 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:44.685 19:21:04 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1743552' 00:07:44.685 killing process with pid 1743552 00:07:44.685 19:21:04 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 1743552 00:07:44.685 19:21:04 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 1743552 00:07:44.945 19:21:04 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1743627 ]] 00:07:44.945 19:21:04 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1743627 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1743627 ']' 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1743627 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' 
Linux = Linux ']' 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1743627 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1743627' 00:07:44.945 killing process with pid 1743627 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 1743627 00:07:44.945 19:21:04 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 1743627 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1743552 ]] 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1743552 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1743552 ']' 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1743552 00:07:45.205 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (1743552) - No such process 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 1743552 is not found' 00:07:45.205 Process with pid 1743552 is not found 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1743627 ]] 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1743627 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1743627 ']' 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1743627 00:07:45.205 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (1743627) - No such process 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 1743627 is not found' 00:07:45.205 Process with pid 1743627 is not found 00:07:45.205 19:21:05 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:45.205 00:07:45.205 real 0m14.247s 00:07:45.205 user 0m24.424s 00:07:45.205 sys 0m5.943s 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.205 19:21:05 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:45.205 ************************************ 00:07:45.205 END TEST cpu_locks 00:07:45.205 ************************************ 00:07:45.205 00:07:45.205 real 0m38.430s 00:07:45.205 user 1m11.754s 00:07:45.205 sys 0m10.307s 00:07:45.205 19:21:05 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.205 19:21:05 event -- common/autotest_common.sh@10 -- # set +x 00:07:45.205 ************************************ 00:07:45.205 END TEST event 00:07:45.205 ************************************ 00:07:45.463 19:21:05 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:45.463 19:21:05 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.463 19:21:05 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.463 19:21:05 -- common/autotest_common.sh@10 -- # set +x 00:07:45.463 ************************************ 00:07:45.463 START TEST thread 00:07:45.463 ************************************ 00:07:45.463 19:21:05 thread -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:45.463 * Looking for test storage... 00:07:45.463 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:45.463 19:21:05 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:45.463 19:21:05 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:07:45.463 19:21:05 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:45.463 19:21:05 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:45.463 19:21:05 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:45.463 19:21:05 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:45.463 19:21:05 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:45.463 19:21:05 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:45.463 19:21:05 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:45.463 19:21:05 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:45.463 19:21:05 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:45.463 19:21:05 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:45.463 19:21:05 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:45.463 19:21:05 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:45.464 19:21:05 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:45.464 19:21:05 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:45.464 19:21:05 thread -- scripts/common.sh@345 -- # : 1 00:07:45.464 19:21:05 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:45.464 19:21:05 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:45.464 19:21:05 thread -- scripts/common.sh@365 -- # decimal 1 00:07:45.464 19:21:05 thread -- scripts/common.sh@353 -- # local d=1 00:07:45.464 19:21:05 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:45.464 19:21:05 thread -- scripts/common.sh@355 -- # echo 1 00:07:45.464 19:21:05 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:45.464 19:21:05 thread -- scripts/common.sh@366 -- # decimal 2 00:07:45.464 19:21:05 thread -- scripts/common.sh@353 -- # local d=2 00:07:45.464 19:21:05 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:45.464 19:21:05 thread -- scripts/common.sh@355 -- # echo 2 00:07:45.464 19:21:05 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:45.464 19:21:05 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:45.464 19:21:05 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:45.464 19:21:05 thread -- scripts/common.sh@368 -- # return 0 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:45.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.464 --rc genhtml_branch_coverage=1 00:07:45.464 --rc genhtml_function_coverage=1 00:07:45.464 --rc genhtml_legend=1 00:07:45.464 --rc geninfo_all_blocks=1 00:07:45.464 --rc geninfo_unexecuted_blocks=1 00:07:45.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.464 ' 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:45.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.464 --rc genhtml_branch_coverage=1 00:07:45.464 --rc genhtml_function_coverage=1 00:07:45.464 --rc genhtml_legend=1 
00:07:45.464 --rc geninfo_all_blocks=1 00:07:45.464 --rc geninfo_unexecuted_blocks=1 00:07:45.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.464 ' 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:45.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.464 --rc genhtml_branch_coverage=1 00:07:45.464 --rc genhtml_function_coverage=1 00:07:45.464 --rc genhtml_legend=1 00:07:45.464 --rc geninfo_all_blocks=1 00:07:45.464 --rc geninfo_unexecuted_blocks=1 00:07:45.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.464 ' 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:45.464 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.464 --rc genhtml_branch_coverage=1 00:07:45.464 --rc genhtml_function_coverage=1 00:07:45.464 --rc genhtml_legend=1 00:07:45.464 --rc geninfo_all_blocks=1 00:07:45.464 --rc geninfo_unexecuted_blocks=1 00:07:45.464 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.464 ' 00:07:45.464 19:21:05 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.464 19:21:05 thread -- common/autotest_common.sh@10 -- # set +x 00:07:45.722 ************************************ 00:07:45.723 START TEST thread_poller_perf 00:07:45.723 ************************************ 00:07:45.723 19:21:05 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:45.723 [2024-11-29 19:21:05.402360] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:45.723 [2024-11-29 19:21:05.402477] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1744582 ] 00:07:45.723 [2024-11-29 19:21:05.478235] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.723 [2024-11-29 19:21:05.500520] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.723 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:46.660 [2024-11-29T18:21:06.566Z] ====================================== 00:07:46.660 [2024-11-29T18:21:06.566Z] busy:2505225924 (cyc) 00:07:46.660 [2024-11-29T18:21:06.566Z] total_run_count: 832000 00:07:46.660 [2024-11-29T18:21:06.566Z] tsc_hz: 2500000000 (cyc) 00:07:46.660 [2024-11-29T18:21:06.566Z] ====================================== 00:07:46.660 [2024-11-29T18:21:06.566Z] poller_cost: 3011 (cyc), 1204 (nsec) 00:07:46.660 00:07:46.660 real 0m1.153s 00:07:46.660 user 0m1.064s 00:07:46.660 sys 0m0.084s 00:07:46.660 19:21:06 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:46.660 19:21:06 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:46.660 ************************************ 00:07:46.660 END TEST thread_poller_perf 00:07:46.660 ************************************ 00:07:46.919 19:21:06 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:46.919 19:21:06 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:46.919 19:21:06 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:46.919 19:21:06 thread -- common/autotest_common.sh@10 -- # set +x 00:07:46.919 ************************************ 00:07:46.919 START TEST thread_poller_perf 00:07:46.919 ************************************ 00:07:46.920 19:21:06 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:46.920 [2024-11-29 19:21:06.629275] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:46.920 [2024-11-29 19:21:06.629371] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1744907 ] 00:07:46.920 [2024-11-29 19:21:06.700905] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.920 [2024-11-29 19:21:06.721977] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.920 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:47.859 [2024-11-29T18:21:07.765Z] ====================================== 00:07:47.859 [2024-11-29T18:21:07.765Z] busy:2501224092 (cyc) 00:07:47.859 [2024-11-29T18:21:07.765Z] total_run_count: 13393000 00:07:47.859 [2024-11-29T18:21:07.765Z] tsc_hz: 2500000000 (cyc) 00:07:47.859 [2024-11-29T18:21:07.765Z] ====================================== 00:07:47.859 [2024-11-29T18:21:07.765Z] poller_cost: 186 (cyc), 74 (nsec) 00:07:47.859 00:07:47.859 real 0m1.138s 00:07:47.859 user 0m1.060s 00:07:47.859 sys 0m0.074s 00:07:47.859 19:21:07 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:47.859 19:21:07 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:47.859 ************************************ 00:07:47.859 END TEST thread_poller_perf 00:07:47.859 ************************************ 00:07:48.119 19:21:07 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:48.119 19:21:07 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:48.119 19:21:07 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:48.119 19:21:07 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:48.119 19:21:07 thread -- common/autotest_common.sh@10 -- # set +x 00:07:48.119 ************************************ 00:07:48.119 START TEST thread_spdk_lock 00:07:48.119 ************************************ 00:07:48.119 19:21:07 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:48.119 [2024-11-29 19:21:07.848586] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:48.119 [2024-11-29 19:21:07.848685] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1745131 ] 00:07:48.119 [2024-11-29 19:21:07.921793] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:48.119 [2024-11-29 19:21:07.945133] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:48.119 [2024-11-29 19:21:07.945137] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.688 [2024-11-29 19:21:08.431780] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.688 [2024-11-29 19:21:08.431812] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:48.688 [2024-11-29 19:21:08.431823] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x1367c00 00:07:48.688 [2024-11-29 19:21:08.432526] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.688 [2024-11-29 19:21:08.432628] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.688 [2024-11-29 
19:21:08.432645] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.688 Starting test contend 00:07:48.688 Worker Delay Wait us Hold us Total us 00:07:48.688 0 3 169244 182919 352164 00:07:48.688 1 5 82291 284544 366836 00:07:48.688 PASS test contend 00:07:48.688 Starting test hold_by_poller 00:07:48.688 PASS test hold_by_poller 00:07:48.688 Starting test hold_by_message 00:07:48.688 PASS test hold_by_message 00:07:48.688 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:48.688 100014 assertions passed 00:07:48.688 0 assertions failed 00:07:48.688 00:07:48.688 real 0m0.629s 00:07:48.688 user 0m1.038s 00:07:48.688 sys 0m0.075s 00:07:48.688 19:21:08 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.688 19:21:08 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:48.688 ************************************ 00:07:48.688 END TEST thread_spdk_lock 00:07:48.688 ************************************ 00:07:48.688 00:07:48.688 real 0m3.343s 00:07:48.688 user 0m3.346s 00:07:48.688 sys 0m0.507s 00:07:48.688 19:21:08 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.688 19:21:08 thread -- common/autotest_common.sh@10 -- # set +x 00:07:48.688 ************************************ 00:07:48.688 END TEST thread 00:07:48.688 ************************************ 00:07:48.688 19:21:08 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:48.688 19:21:08 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:48.688 19:21:08 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:48.688 19:21:08 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:48.688 19:21:08 -- common/autotest_common.sh@10 -- # set +x 00:07:48.688 ************************************ 00:07:48.689 START TEST app_cmdline 00:07:48.689 ************************************ 00:07:48.689 19:21:08 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:48.948 * Looking for test storage... 
00:07:48.948 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:48.948 19:21:08 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:48.948 19:21:08 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:07:48.948 19:21:08 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:48.948 19:21:08 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:48.948 19:21:08 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:48.948 19:21:08 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:48.948 19:21:08 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:48.948 19:21:08 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:48.948 19:21:08 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:48.949 19:21:08 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:48.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.949 --rc genhtml_branch_coverage=1 00:07:48.949 --rc genhtml_function_coverage=1 00:07:48.949 --rc genhtml_legend=1 00:07:48.949 --rc geninfo_all_blocks=1 00:07:48.949 --rc geninfo_unexecuted_blocks=1 00:07:48.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.949 ' 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:48.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.949 --rc genhtml_branch_coverage=1 00:07:48.949 --rc genhtml_function_coverage=1 00:07:48.949 --rc 
genhtml_legend=1 00:07:48.949 --rc geninfo_all_blocks=1 00:07:48.949 --rc geninfo_unexecuted_blocks=1 00:07:48.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.949 ' 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:48.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.949 --rc genhtml_branch_coverage=1 00:07:48.949 --rc genhtml_function_coverage=1 00:07:48.949 --rc genhtml_legend=1 00:07:48.949 --rc geninfo_all_blocks=1 00:07:48.949 --rc geninfo_unexecuted_blocks=1 00:07:48.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.949 ' 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:48.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.949 --rc genhtml_branch_coverage=1 00:07:48.949 --rc genhtml_function_coverage=1 00:07:48.949 --rc genhtml_legend=1 00:07:48.949 --rc geninfo_all_blocks=1 00:07:48.949 --rc geninfo_unexecuted_blocks=1 00:07:48.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.949 ' 00:07:48.949 19:21:08 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:48.949 19:21:08 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1745271 00:07:48.949 19:21:08 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1745271 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 1745271 ']' 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:48.949 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:48.949 19:21:08 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:48.949 19:21:08 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:48.949 [2024-11-29 19:21:08.792631] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:07:48.949 [2024-11-29 19:21:08.792716] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1745271 ] 00:07:49.208 [2024-11-29 19:21:08.863950] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.208 [2024-11-29 19:21:08.886646] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.208 19:21:09 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:49.208 19:21:09 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:07:49.208 19:21:09 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:49.468 { 00:07:49.468 "version": "SPDK v25.01-pre git sha1 35cd3e84d", 00:07:49.468 "fields": { 00:07:49.468 "major": 25, 00:07:49.468 "minor": 1, 00:07:49.468 "patch": 0, 00:07:49.468 "suffix": "-pre", 00:07:49.468 "commit": "35cd3e84d" 00:07:49.468 } 00:07:49.468 } 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:49.468 19:21:09 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.468 19:21:09 app_cmdline -- 
common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:49.468 19:21:09 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:49.728 request: 00:07:49.728 { 00:07:49.728 "method": "env_dpdk_get_mem_stats", 00:07:49.728 "req_id": 1 00:07:49.728 } 00:07:49.728 Got JSON-RPC error response 00:07:49.728 response: 00:07:49.728 { 00:07:49.728 "code": -32601, 00:07:49.728 "message": "Method not found" 00:07:49.728 } 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:49.728 19:21:09 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1745271 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 1745271 ']' 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 1745271 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1745271 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1745271' 00:07:49.728 killing process with pid 1745271 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@973 -- # kill 1745271 00:07:49.728 19:21:09 app_cmdline -- common/autotest_common.sh@978 -- # wait 1745271 00:07:49.987 00:07:49.987 real 0m1.256s 00:07:49.987 user 0m1.421s 00:07:49.987 sys 0m0.494s 00:07:49.988 19:21:09 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:49.988 19:21:09 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:49.988 ************************************ 00:07:49.988 END TEST app_cmdline 00:07:49.988 ************************************ 00:07:49.988 19:21:09 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:49.988 19:21:09 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:49.988 19:21:09 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:49.988 19:21:09 -- common/autotest_common.sh@10 -- # set +x 00:07:50.250 ************************************ 00:07:50.250 START TEST version 00:07:50.250 ************************************ 00:07:50.250 19:21:09 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:50.250 * Looking for test storage... 
00:07:50.250 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1693 -- # lcov --version 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:50.250 19:21:10 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:50.250 19:21:10 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:50.250 19:21:10 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:50.250 19:21:10 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:50.250 19:21:10 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:50.250 19:21:10 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:50.250 19:21:10 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:50.250 19:21:10 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:50.250 19:21:10 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:50.250 19:21:10 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:50.250 19:21:10 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:50.250 19:21:10 version -- scripts/common.sh@344 -- # case "$op" in 00:07:50.250 19:21:10 version -- scripts/common.sh@345 -- # : 1 00:07:50.250 19:21:10 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:50.250 19:21:10 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:50.250 19:21:10 version -- scripts/common.sh@365 -- # decimal 1 00:07:50.250 19:21:10 version -- scripts/common.sh@353 -- # local d=1 00:07:50.250 19:21:10 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:50.250 19:21:10 version -- scripts/common.sh@355 -- # echo 1 00:07:50.250 19:21:10 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:50.250 19:21:10 version -- scripts/common.sh@366 -- # decimal 2 00:07:50.250 19:21:10 version -- scripts/common.sh@353 -- # local d=2 00:07:50.250 19:21:10 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:50.250 19:21:10 version -- scripts/common.sh@355 -- # echo 2 00:07:50.250 19:21:10 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:50.250 19:21:10 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:50.250 19:21:10 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:50.250 19:21:10 version -- scripts/common.sh@368 -- # return 0 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:50.250 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.250 --rc genhtml_branch_coverage=1 00:07:50.250 --rc genhtml_function_coverage=1 00:07:50.250 --rc genhtml_legend=1 00:07:50.250 --rc geninfo_all_blocks=1 00:07:50.250 --rc geninfo_unexecuted_blocks=1 00:07:50.250 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.250 ' 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:50.250 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.250 --rc genhtml_branch_coverage=1 00:07:50.250 --rc genhtml_function_coverage=1 00:07:50.250 --rc genhtml_legend=1 00:07:50.250 --rc geninfo_all_blocks=1 00:07:50.250 --rc geninfo_unexecuted_blocks=1 00:07:50.250 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.250 ' 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:50.250 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.250 --rc genhtml_branch_coverage=1 00:07:50.250 --rc genhtml_function_coverage=1 00:07:50.250 --rc genhtml_legend=1 00:07:50.250 --rc geninfo_all_blocks=1 00:07:50.250 --rc geninfo_unexecuted_blocks=1 00:07:50.250 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.250 ' 00:07:50.250 19:21:10 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:50.250 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.250 --rc genhtml_branch_coverage=1 00:07:50.250 --rc genhtml_function_coverage=1 00:07:50.250 --rc genhtml_legend=1 00:07:50.250 --rc geninfo_all_blocks=1 00:07:50.250 --rc geninfo_unexecuted_blocks=1 00:07:50.250 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.250 ' 00:07:50.250 19:21:10 version -- app/version.sh@17 -- # get_header_version major 00:07:50.250 19:21:10 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # cut -f2 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.250 19:21:10 version -- app/version.sh@17 -- # major=25 00:07:50.250 19:21:10 version -- app/version.sh@18 -- # get_header_version minor 00:07:50.250 19:21:10 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # cut -f2 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.250 19:21:10 version -- app/version.sh@18 -- # minor=1 00:07:50.250 19:21:10 version -- app/version.sh@19 -- # get_header_version patch 00:07:50.250 19:21:10 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # cut -f2 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.250 19:21:10 version -- app/version.sh@19 -- # patch=0 00:07:50.250 19:21:10 version -- app/version.sh@20 -- # get_header_version suffix 00:07:50.250 19:21:10 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # cut -f2 00:07:50.250 19:21:10 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.250 19:21:10 version -- app/version.sh@20 -- # suffix=-pre 00:07:50.250 19:21:10 version -- app/version.sh@22 -- # version=25.1 00:07:50.250 19:21:10 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:50.250 19:21:10 version -- app/version.sh@28 -- # version=25.1rc0 00:07:50.250 19:21:10 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:50.250 19:21:10 version -- app/version.sh@30 -- # 
python3 -c 'import spdk; print(spdk.__version__)' 00:07:50.508 19:21:10 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:50.508 19:21:10 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:50.508 00:07:50.508 real 0m0.268s 00:07:50.509 user 0m0.154s 00:07:50.509 sys 0m0.171s 00:07:50.509 19:21:10 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:50.509 19:21:10 version -- common/autotest_common.sh@10 -- # set +x 00:07:50.509 ************************************ 00:07:50.509 END TEST version 00:07:50.509 ************************************ 00:07:50.509 19:21:10 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@194 -- # uname -s 00:07:50.509 19:21:10 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@260 -- # timing_exit lib 00:07:50.509 19:21:10 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:50.509 19:21:10 -- common/autotest_common.sh@10 -- # set +x 00:07:50.509 19:21:10 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:50.509 19:21:10 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:50.509 19:21:10 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:50.509 19:21:10 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:50.509 19:21:10 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.509 19:21:10 -- common/autotest_common.sh@10 -- # set +x 00:07:50.509 ************************************ 00:07:50.509 START TEST llvm_fuzz 00:07:50.509 ************************************ 00:07:50.509 19:21:10 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:50.768 * Looking for test storage... 
00:07:50.768 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:50.768 19:21:10 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:50.768 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.768 --rc genhtml_branch_coverage=1 00:07:50.768 --rc genhtml_function_coverage=1 00:07:50.768 --rc genhtml_legend=1 00:07:50.768 --rc geninfo_all_blocks=1 00:07:50.768 --rc geninfo_unexecuted_blocks=1 00:07:50.768 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.768 ' 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:50.768 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.768 --rc genhtml_branch_coverage=1 00:07:50.768 --rc genhtml_function_coverage=1 00:07:50.768 --rc genhtml_legend=1 00:07:50.768 --rc geninfo_all_blocks=1 00:07:50.768 --rc 
geninfo_unexecuted_blocks=1 00:07:50.768 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.768 ' 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:50.768 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.768 --rc genhtml_branch_coverage=1 00:07:50.768 --rc genhtml_function_coverage=1 00:07:50.768 --rc genhtml_legend=1 00:07:50.768 --rc geninfo_all_blocks=1 00:07:50.768 --rc geninfo_unexecuted_blocks=1 00:07:50.768 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.768 ' 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:50.768 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.768 --rc genhtml_branch_coverage=1 00:07:50.768 --rc genhtml_function_coverage=1 00:07:50.768 --rc genhtml_legend=1 00:07:50.768 --rc geninfo_all_blocks=1 00:07:50.768 --rc geninfo_unexecuted_blocks=1 00:07:50.768 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.768 ' 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:50.768 19:21:10 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.768 19:21:10 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:50.768 ************************************ 00:07:50.768 START TEST nvmf_llvm_fuzz 00:07:50.768 ************************************ 00:07:50.768 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:50.768 * Looking for test storage... 
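The lt 1.15 2 probe above (and repeated below for the nvmf run) is scripts/common.sh gating the lcov coverage flags: cmp_versions splits both version strings on '.', '-' or ':' via IFS, walks the components left to right, and decides at the first unequal pair. A compact re-creation of that walk, assuming purely numeric components (the real script also routes each part through the decimal normalizer visible in the trace):

    cmp_versions() {   # usage: cmp_versions 1.15 '<' 2
        local IFS=.-:                       # split on '.', '-' or ':'
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        local op=$2
        read -ra ver2 <<< "$3"
        local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            local a=${ver1[v]:-0} b=${ver2[v]:-0}   # missing components count as 0
            if (( a > b )); then [[ $op == *'>'* ]]; return; fi
            if (( a < b )); then [[ $op == *'<'* ]]; return; fi
        done
        [[ $op == *'='* ]]                  # every component equal
    }

    lt() { cmp_versions "$1" '<' "$2"; }
    lt 1.15 2 && echo "1.15 < 2"            # succeeds, matching the branch taken above

The first unequal component settles the comparison, so 1.15 sorts below 2 even though 15 > 2 as a bare number.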
00:07:51.031 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:51.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.031 --rc genhtml_branch_coverage=1 00:07:51.031 --rc genhtml_function_coverage=1 00:07:51.031 --rc genhtml_legend=1 00:07:51.031 --rc geninfo_all_blocks=1 00:07:51.031 --rc geninfo_unexecuted_blocks=1 00:07:51.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.031 ' 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:51.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.031 --rc genhtml_branch_coverage=1 00:07:51.031 --rc genhtml_function_coverage=1 00:07:51.031 --rc genhtml_legend=1 00:07:51.031 --rc geninfo_all_blocks=1 00:07:51.031 --rc geninfo_unexecuted_blocks=1 00:07:51.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.031 ' 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:51.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.031 --rc genhtml_branch_coverage=1 00:07:51.031 --rc genhtml_function_coverage=1 00:07:51.031 --rc genhtml_legend=1 00:07:51.031 --rc geninfo_all_blocks=1 00:07:51.031 --rc geninfo_unexecuted_blocks=1 00:07:51.031 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.031 ' 00:07:51.031 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:51.031 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.032 --rc genhtml_branch_coverage=1 00:07:51.032 --rc genhtml_function_coverage=1 00:07:51.032 --rc genhtml_legend=1 00:07:51.032 --rc geninfo_all_blocks=1 00:07:51.032 --rc geninfo_unexecuted_blocks=1 00:07:51.032 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.032 ' 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:07:51.032 19:21:10 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:51.032 19:21:10 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:51.032 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:51.033 #define SPDK_CONFIG_H 00:07:51.033 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:51.033 #define SPDK_CONFIG_APPS 1 00:07:51.033 #define SPDK_CONFIG_ARCH native 00:07:51.033 #undef SPDK_CONFIG_ASAN 00:07:51.033 #undef SPDK_CONFIG_AVAHI 00:07:51.033 #undef SPDK_CONFIG_CET 00:07:51.033 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:51.033 #define SPDK_CONFIG_COVERAGE 1 00:07:51.033 #define SPDK_CONFIG_CROSS_PREFIX 00:07:51.033 #undef SPDK_CONFIG_CRYPTO 00:07:51.033 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:51.033 #undef SPDK_CONFIG_CUSTOMOCF 00:07:51.033 #undef SPDK_CONFIG_DAOS 00:07:51.033 #define SPDK_CONFIG_DAOS_DIR 00:07:51.033 #define SPDK_CONFIG_DEBUG 1 00:07:51.033 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:51.033 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.033 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:51.033 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.033 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:51.033 #undef SPDK_CONFIG_DPDK_UADK 00:07:51.033 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:51.033 #define SPDK_CONFIG_EXAMPLES 1 00:07:51.033 #undef SPDK_CONFIG_FC 00:07:51.033 #define SPDK_CONFIG_FC_PATH 00:07:51.033 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:51.033 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:51.033 #define SPDK_CONFIG_FSDEV 1 00:07:51.033 #undef 
SPDK_CONFIG_FUSE 00:07:51.033 #define SPDK_CONFIG_FUZZER 1 00:07:51.033 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:51.033 #undef SPDK_CONFIG_GOLANG 00:07:51.033 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:51.033 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:51.033 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:51.033 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:51.033 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:51.033 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:51.033 #undef SPDK_CONFIG_HAVE_LZ4 00:07:51.033 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:51.033 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:51.033 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:51.033 #define SPDK_CONFIG_IDXD 1 00:07:51.033 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:51.033 #undef SPDK_CONFIG_IPSEC_MB 00:07:51.033 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:51.033 #define SPDK_CONFIG_ISAL 1 00:07:51.033 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:51.033 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:51.033 #define SPDK_CONFIG_LIBDIR 00:07:51.033 #undef SPDK_CONFIG_LTO 00:07:51.033 #define SPDK_CONFIG_MAX_LCORES 128 00:07:51.033 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:07:51.033 #define SPDK_CONFIG_NVME_CUSE 1 00:07:51.033 #undef SPDK_CONFIG_OCF 00:07:51.033 #define SPDK_CONFIG_OCF_PATH 00:07:51.033 #define SPDK_CONFIG_OPENSSL_PATH 00:07:51.033 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:51.033 #define SPDK_CONFIG_PGO_DIR 00:07:51.033 #undef SPDK_CONFIG_PGO_USE 00:07:51.033 #define SPDK_CONFIG_PREFIX /usr/local 00:07:51.033 #undef SPDK_CONFIG_RAID5F 00:07:51.033 #undef SPDK_CONFIG_RBD 00:07:51.033 #define SPDK_CONFIG_RDMA 1 00:07:51.033 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:51.033 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:51.033 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:51.033 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:51.033 #undef SPDK_CONFIG_SHARED 00:07:51.033 #undef SPDK_CONFIG_SMA 00:07:51.033 #define SPDK_CONFIG_TESTS 1 00:07:51.033 #undef SPDK_CONFIG_TSAN 00:07:51.033 #define SPDK_CONFIG_UBLK 1 00:07:51.033 #define SPDK_CONFIG_UBSAN 1 00:07:51.033 #undef SPDK_CONFIG_UNIT_TESTS 00:07:51.033 #undef SPDK_CONFIG_URING 00:07:51.033 #define SPDK_CONFIG_URING_PATH 00:07:51.033 #undef SPDK_CONFIG_URING_ZNS 00:07:51.033 #undef SPDK_CONFIG_USDT 00:07:51.033 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:51.033 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:51.033 #define SPDK_CONFIG_VFIO_USER 1 00:07:51.033 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:51.033 #define SPDK_CONFIG_VHOST 1 00:07:51.033 #define SPDK_CONFIG_VIRTIO 1 00:07:51.033 #undef SPDK_CONFIG_VTUNE 00:07:51.033 #define SPDK_CONFIG_VTUNE_DIR 00:07:51.033 #define SPDK_CONFIG_WERROR 1 00:07:51.033 #define SPDK_CONFIG_WPDK_DIR 00:07:51.033 #undef SPDK_CONFIG_XNVME 00:07:51.033 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:51.033 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:51.034 19:21:10 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:51.034 
19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.034 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 1745758 ]] 00:07:51.035 
19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 1745758 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.H82blK 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.H82blK/tests/nvmf /tmp/spdk.H82blK 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.035 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=51117568000 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730607104 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=10613039104 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30860537856 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865301504 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340129792 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346122240 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5992448 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30863355904 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865305600 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=1949696 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:07:51.036 * Looking for test storage... 
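The df -T scan that finishes above (autotest_common.sh@373-376, inside set_test_storage) can be read as the following shell loop. This is a sketch reconstructed from the xtrace records, not the verbatim source: the array and variable names (mounts, fss, sizes, avails, uses, requested_size) are copied from the log, while the wrapping function and the printf at the end are assumptions added for a runnable example.

    # Sketch of the mount scan traced above; reconstructed from the xtrace
    # records of set_test_storage, not copied from autotest_common.sh.
    scan_mounts() {
        local -A mounts fss sizes avails uses
        local source fs size use avail mount
        # df -T output columns: source, fs type, total, used, available, use%, mount point
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source    # backing device, e.g. spdk_root or tmpfs
            fss["$mount"]=$fs           # filesystem type: overlay, tmpfs, ext2, ...
            sizes["$mount"]=$size       # total size of the mount
            uses["$mount"]=$use         # space already used
            avails["$mount"]=$avail     # free space, compared against requested_size
        done < <(df -T | grep -v Filesystem)
        printf 'tracked %d mounts\n' "${#mounts[@]}"
    }

The selection pass traced next walks storage_candidates, resolves each candidate directory to its mount, takes target_space from avails for that mount, and keeps the first candidate with at least requested_size available (2147483648 bytes plus overhead, i.e. the 2214592512 above) — here the overlay root with ~51 GB free, hence the "* Found test storage" record further down.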
00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=51117568000 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=12827631616 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.036 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:07:51.036 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:51.296 19:21:10 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:51.296 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.296 --rc genhtml_branch_coverage=1 00:07:51.296 --rc genhtml_function_coverage=1 00:07:51.296 --rc genhtml_legend=1 00:07:51.296 --rc geninfo_all_blocks=1 00:07:51.296 --rc geninfo_unexecuted_blocks=1 00:07:51.296 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.296 ' 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:51.296 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.296 --rc genhtml_branch_coverage=1 00:07:51.296 --rc genhtml_function_coverage=1 00:07:51.296 --rc genhtml_legend=1 00:07:51.296 --rc geninfo_all_blocks=1 00:07:51.296 --rc geninfo_unexecuted_blocks=1 00:07:51.296 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.296 ' 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:51.296 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.296 --rc genhtml_branch_coverage=1 00:07:51.296 --rc genhtml_function_coverage=1 00:07:51.296 --rc genhtml_legend=1 00:07:51.296 --rc geninfo_all_blocks=1 00:07:51.296 --rc geninfo_unexecuted_blocks=1 00:07:51.296 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.296 ' 00:07:51.296 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:51.296 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.296 --rc genhtml_branch_coverage=1 00:07:51.296 --rc genhtml_function_coverage=1 00:07:51.296 --rc genhtml_legend=1 00:07:51.296 --rc geninfo_all_blocks=1 00:07:51.296 --rc geninfo_unexecuted_blocks=1 00:07:51.296 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.296 ' 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:51.297 19:21:11 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:51.297 19:21:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:51.297 [2024-11-29 19:21:11.092588] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:51.297 [2024-11-29 19:21:11.092679] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1746018 ] 00:07:51.556 [2024-11-29 19:21:11.293039] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.556 [2024-11-29 19:21:11.305375] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.556 [2024-11-29 19:21:11.357991] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.556 [2024-11-29 19:21:11.374359] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:51.556 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.556 INFO: Seed: 1996184539 00:07:51.556 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:07:51.556 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:07:51.556 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.556 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.556 #2 INITED exec/s: 0 rss: 64Mb 00:07:51.556 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.556 This may also happen if the target rejected all inputs we tried so far 00:07:51.556 [2024-11-29 19:21:11.429784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.556 [2024-11-29 19:21:11.429812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.075 NEW_FUNC[1/716]: 0x459648 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:52.075 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.075 #4 NEW cov: 12235 ft: 12234 corp: 2/84b lim: 320 exec/s: 0 rss: 72Mb L: 83/83 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:52.075 [2024-11-29 19:21:11.740558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (31) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.075 [2024-11-29 19:21:11.740590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.075 NEW_FUNC[1/1]: 0x1994b28 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:52.075 #9 NEW cov: 12383 ft: 13180 corp: 3/179b lim: 320 exec/s: 0 rss: 72Mb L: 95/95 MS: 5 ShuffleBytes-ChangeByte-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:52.075 [2024-11-29 19:21:11.780562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.075 [2024-11-29 19:21:11.780590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.075 #10 NEW cov: 12389 ft: 13469 corp: 4/263b lim: 320 exec/s: 0 
rss: 72Mb L: 84/95 MS: 1 CrossOver- 00:07:52.075 [2024-11-29 19:21:11.840731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.075 [2024-11-29 19:21:11.840757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.075 #11 NEW cov: 12474 ft: 13712 corp: 5/380b lim: 320 exec/s: 0 rss: 72Mb L: 117/117 MS: 1 InsertRepeatedBytes- 00:07:52.075 [2024-11-29 19:21:11.880807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.075 [2024-11-29 19:21:11.880833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.075 #12 NEW cov: 12474 ft: 13778 corp: 6/488b lim: 320 exec/s: 0 rss: 72Mb L: 108/117 MS: 1 EraseBytes- 00:07:52.075 [2024-11-29 19:21:11.941023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (31) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.075 [2024-11-29 19:21:11.941049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.334 #13 NEW cov: 12474 ft: 13881 corp: 7/583b lim: 320 exec/s: 0 rss: 72Mb L: 95/117 MS: 1 ShuffleBytes- 00:07:52.334 [2024-11-29 19:21:12.001117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.334 [2024-11-29 19:21:12.001143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.334 #14 NEW cov: 12474 ft: 13958 corp: 8/691b lim: 320 exec/s: 0 rss: 72Mb L: 108/117 MS: 1 ChangeByte- 00:07:52.334 [2024-11-29 19:21:12.061442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (31) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.334 [2024-11-29 19:21:12.061468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.334 [2024-11-29 19:21:12.061529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.334 [2024-11-29 19:21:12.061543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.334 NEW_FUNC[1/1]: 0x155e1a8 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:07:52.334 #15 NEW cov: 12505 ft: 14207 corp: 9/830b lim: 320 exec/s: 0 rss: 73Mb L: 139/139 MS: 1 CrossOver- 00:07:52.334 [2024-11-29 19:21:12.121520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (31) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.334 [2024-11-29 19:21:12.121546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.334 #16 NEW cov: 12505 ft: 14315 corp: 10/925b lim: 320 exec/s: 0 rss: 73Mb L: 95/139 MS: 1 ChangeBit- 00:07:52.334 [2024-11-29 19:21:12.161680] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.334 [2024-11-29 19:21:12.161708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.334 #17 NEW cov: 12505 ft: 14384 corp: 11/1041b lim: 320 exec/s: 0 rss: 73Mb L: 116/139 MS: 1 CMP- DE: "\001\224 7\351\242M\350"- 00:07:52.334 [2024-11-29 19:21:12.201715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.334 [2024-11-29 19:21:12.201742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.334 #18 NEW cov: 12505 ft: 14394 corp: 12/1158b lim: 320 exec/s: 0 rss: 73Mb L: 117/139 MS: 1 CrossOver- 00:07:52.593 [2024-11-29 19:21:12.242044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.593 [2024-11-29 19:21:12.242071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 [2024-11-29 19:21:12.242136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.593 [2024-11-29 19:21:12.242151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.593 [2024-11-29 19:21:12.242215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.593 [2024-11-29 19:21:12.242229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.593 #19 NEW cov: 12505 ft: 14576 corp: 13/1383b lim: 320 exec/s: 0 rss: 73Mb L: 225/225 MS: 1 InsertRepeatedBytes- 00:07:52.593 [2024-11-29 19:21:12.281942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.593 [2024-11-29 19:21:12.281968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:52.593 #20 NEW cov: 12528 ft: 14622 corp: 14/1499b lim: 320 exec/s: 0 rss: 73Mb L: 116/225 MS: 1 ShuffleBytes- 00:07:52.593 [2024-11-29 19:21:12.342106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.593 [2024-11-29 19:21:12.342133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 #21 NEW cov: 12528 ft: 14687 corp: 15/1615b lim: 320 exec/s: 0 rss: 73Mb L: 116/225 MS: 1 ChangeBit- 00:07:52.593 [2024-11-29 19:21:12.402301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x7ffffffe8 00:07:52.593 [2024-11-29 19:21:12.402328] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 #25 NEW cov: 12528 ft: 14709 corp: 16/1685b lim: 320 exec/s: 25 rss: 73Mb L: 70/225 MS: 4 EraseBytes-ChangeBit-ChangeBinInt-PersAutoDict- DE: "\001\224 7\351\242M\350"- 00:07:52.593 [2024-11-29 19:21:12.442380] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.593 [2024-11-29 19:21:12.442407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.593 #26 NEW cov: 12528 ft: 14714 corp: 17/1806b lim: 320 exec/s: 26 rss: 73Mb L: 121/225 MS: 1 CopyPart- 00:07:52.852 [2024-11-29 19:21:12.502548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.852 [2024-11-29 19:21:12.502574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.852 #27 NEW cov: 12528 ft: 14737 corp: 18/1904b lim: 320 exec/s: 27 rss: 73Mb L: 98/225 MS: 1 CrossOver- 00:07:52.852 [2024-11-29 19:21:12.562951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.852 [2024-11-29 19:21:12.562977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.852 [2024-11-29 19:21:12.563036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:5 nsid:e0e0e0e0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 00:07:52.852 [2024-11-29 19:21:12.563051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.852 [2024-11-29 19:21:12.563109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:6 nsid:e0e0e0e0 cdw10:ffffffff cdw11:ffffffff 00:07:52.852 [2024-11-29 19:21:12.563126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.852 #33 NEW cov: 12529 ft: 14797 corp: 19/2124b lim: 320 exec/s: 33 rss: 73Mb L: 220/225 MS: 1 InsertRepeatedBytes- 00:07:52.852 [2024-11-29 19:21:12.602992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:b6b6b6b6 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.852 [2024-11-29 19:21:12.603018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.852 [2024-11-29 19:21:12.603079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b6) qid:0 cid:5 nsid:b6b6b6b6 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb6b6b6b6b6b6b6b6 00:07:52.852 [2024-11-29 19:21:12.603093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.852 #36 NEW cov: 12537 ft: 14891 corp: 20/2254b lim: 320 exec/s: 36 rss: 73Mb L: 130/225 MS: 3 PersAutoDict-ShuffleBytes-InsertRepeatedBytes- DE: "\001\224 7\351\242M\350"- 00:07:52.852 [2024-11-29 19:21:12.642938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.852 
[2024-11-29 19:21:12.642964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.852 #37 NEW cov: 12537 ft: 14903 corp: 21/2352b lim: 320 exec/s: 37 rss: 73Mb L: 98/225 MS: 1 CrossOver- 00:07:52.852 [2024-11-29 19:21:12.703096] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.852 [2024-11-29 19:21:12.703122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.852 #38 NEW cov: 12537 ft: 14918 corp: 22/2468b lim: 320 exec/s: 38 rss: 73Mb L: 116/225 MS: 1 ChangeASCIIInt- 00:07:52.852 [2024-11-29 19:21:12.743377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:43434343 SGL TRANSPORT DATA BLOCK TRANSPORT 0x4343434343434343 00:07:52.852 [2024-11-29 19:21:12.743403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.852 [2024-11-29 19:21:12.743469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (43) qid:0 cid:5 nsid:43434343 cdw10:ffffffff cdw11:ffffff7f SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff434343 00:07:52.852 [2024-11-29 19:21:12.743483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.852 [2024-11-29 19:21:12.743546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:a2e93720 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.852 [2024-11-29 19:21:12.743559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.110 #39 NEW cov: 12537 ft: 14954 corp: 23/2660b lim: 320 exec/s: 39 rss: 73Mb L: 192/225 MS: 1 InsertRepeatedBytes- 00:07:53.110 [2024-11-29 19:21:12.803385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.110 [2024-11-29 19:21:12.803411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.110 #40 NEW cov: 12537 ft: 14985 corp: 24/2768b lim: 320 exec/s: 40 rss: 73Mb L: 108/225 MS: 1 ChangeBinInt- 00:07:53.110 [2024-11-29 19:21:12.843737] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.110 [2024-11-29 19:21:12.843761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.110 [2024-11-29 19:21:12.843823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:5 nsid:e0e0e0e0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 00:07:53.110 [2024-11-29 19:21:12.843840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.110 [2024-11-29 19:21:12.843900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:6 nsid:e0e0e0e0 cdw10:ffffffff cdw11:ffffffff 00:07:53.110 [2024-11-29 19:21:12.843913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
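As a reading aid for the status records above and below: each "#N NEW" line is standard libFuzzer status output. A minimal annotated example, using a record copied from this run (field meanings follow libFuzzer's documented status format and are a best-effort gloss, not tool output):

    # #20 NEW cov: 12528 ft: 14622 corp: 14/1499b lim: 320 exec/s: 0 rss: 73Mb L: 116/225 MS: 1 ShuffleBytes-
    #
    # #20          event occurred on the 20th execution of the target
    # NEW          a new interesting input was added to the corpus
    # cov / ft     coverage edges and finer-grained coverage features seen so far
    # corp         corpus size: 14 inputs totalling 1499 bytes
    # lim          current cap on generated input length (ramps toward -max_len)
    # exec/s, rss  throughput and resident memory
    # L            size of this input / largest input in the corpus so far
    # MS           mutation sequence that produced it (count plus mutator names)

The "Recommended dictionary" block near the end of each run lists byte sequences (here the PersAutoDict entry "\001\224 7\351\242M\350") that repeatedly helped reach new coverage.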
00:07:53.110 #41 NEW cov: 12537 ft: 15007 corp: 25/2989b lim: 320 exec/s: 41 rss: 73Mb L: 221/225 MS: 1 InsertByte- 00:07:53.110 [2024-11-29 19:21:12.903756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:bfbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0xbfbfbfbfbfbfbfbf 00:07:53.110 [2024-11-29 19:21:12.903782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.110 [2024-11-29 19:21:12.903842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (bf) qid:0 cid:5 nsid:bfbfbfbf cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x37209401ffffffff 00:07:53.110 [2024-11-29 19:21:12.903856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.110 #42 NEW cov: 12537 ft: 15019 corp: 26/3134b lim: 320 exec/s: 42 rss: 74Mb L: 145/225 MS: 1 InsertRepeatedBytes- 00:07:53.110 [2024-11-29 19:21:12.964100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.110 [2024-11-29 19:21:12.964126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.110 [2024-11-29 19:21:12.964187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:5 nsid:e0e0e0e0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 00:07:53.110 [2024-11-29 19:21:12.964201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.110 [2024-11-29 19:21:12.964260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:6 nsid:e0e0e0e0 cdw10:ffffffff cdw11:ffffffff 00:07:53.111 [2024-11-29 19:21:12.964274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.111 #43 NEW cov: 12537 ft: 15031 corp: 27/3364b lim: 320 exec/s: 43 rss: 74Mb L: 230/230 MS: 1 CrossOver- 00:07:53.111 [2024-11-29 19:21:13.003922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.111 [2024-11-29 19:21:13.003947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.369 #44 NEW cov: 12537 ft: 15034 corp: 28/3472b lim: 320 exec/s: 44 rss: 74Mb L: 108/230 MS: 1 ChangeBinInt- 00:07:53.369 [2024-11-29 19:21:13.044017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2000000000 00:07:53.369 [2024-11-29 19:21:13.044044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.369 #45 NEW cov: 12537 ft: 15107 corp: 29/3570b lim: 320 exec/s: 45 rss: 74Mb L: 98/230 MS: 1 ChangeBit- 00:07:53.369 [2024-11-29 19:21:13.104241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (31) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.369 [2024-11-29 19:21:13.104267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.369 #46 NEW cov: 12537 ft: 15118 corp: 30/3681b lim: 320 
exec/s: 46 rss: 74Mb L: 111/230 MS: 1 CopyPart- 00:07:53.369 [2024-11-29 19:21:13.144592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.369 [2024-11-29 19:21:13.144623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.369 [2024-11-29 19:21:13.144685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:5 nsid:e0e0e0e0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 00:07:53.369 [2024-11-29 19:21:13.144699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.369 [2024-11-29 19:21:13.144760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:6 nsid:e0e0e0e0 cdw10:ffffffff cdw11:ffffffff 00:07:53.369 [2024-11-29 19:21:13.144773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.369 #47 NEW cov: 12537 ft: 15150 corp: 31/3902b lim: 320 exec/s: 47 rss: 74Mb L: 221/230 MS: 1 ChangeByte- 00:07:53.369 [2024-11-29 19:21:13.204649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.369 [2024-11-29 19:21:13.204675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.369 [2024-11-29 19:21:13.204737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e0) qid:0 cid:5 nsid:e0e0e0e0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 00:07:53.369 [2024-11-29 19:21:13.204751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.369 #48 NEW cov: 12537 ft: 15185 corp: 32/4056b lim: 320 exec/s: 48 rss: 74Mb L: 154/230 MS: 1 CrossOver- 00:07:53.369 [2024-11-29 19:21:13.244772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:b6b6b6b6 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.369 [2024-11-29 19:21:13.244798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.369 [2024-11-29 19:21:13.244863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b6) qid:0 cid:5 nsid:b6b6b6b6 cdw10:b6b6b6b6 cdw11:b6b6b6b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb6b6b6b6b6b6b6b6 00:07:53.369 [2024-11-29 19:21:13.244877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.629 #49 NEW cov: 12537 ft: 15191 corp: 33/4186b lim: 320 exec/s: 49 rss: 74Mb L: 130/230 MS: 1 ChangeBit- 00:07:53.629 [2024-11-29 19:21:13.304862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.629 [2024-11-29 19:21:13.304888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.629 [2024-11-29 19:21:13.304951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:e0e0e0e0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe0e0e0e0e0e0e0e0 00:07:53.629 [2024-11-29 19:21:13.304965] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.629 #50 NEW cov: 12537 ft: 15203 corp: 34/4341b lim: 320 exec/s: 50 rss: 74Mb L: 155/230 MS: 1 InsertByte- 00:07:53.629 [2024-11-29 19:21:13.364957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.629 [2024-11-29 19:21:13.364983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.629 #51 NEW cov: 12537 ft: 15243 corp: 35/4449b lim: 320 exec/s: 51 rss: 74Mb L: 108/230 MS: 1 CrossOver- 00:07:53.629 [2024-11-29 19:21:13.425152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (31) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.629 [2024-11-29 19:21:13.425179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.629 #52 NEW cov: 12537 ft: 15249 corp: 36/4544b lim: 320 exec/s: 26 rss: 74Mb L: 95/230 MS: 1 ChangeBit- 00:07:53.629 #52 DONE cov: 12537 ft: 15249 corp: 36/4544b lim: 320 exec/s: 26 rss: 74Mb 00:07:53.629 ###### Recommended dictionary. ###### 00:07:53.629 "\001\224 7\351\242M\350" # Uses: 4 00:07:53.629 ###### End of recommended dictionary. ###### 00:07:53.629 Done 52 runs in 2 second(s) 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:53.888 19:21:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:53.888 [2024-11-29 19:21:13.592421] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:53.888 [2024-11-29 19:21:13.592509] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1746315 ] 00:07:53.888 [2024-11-29 19:21:13.781511] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.888 [2024-11-29 19:21:13.794343] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.148 [2024-11-29 19:21:13.846906] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.148 [2024-11-29 19:21:13.863281] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:54.148 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.148 INFO: Seed: 190217764 00:07:54.148 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:07:54.148 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:07:54.148 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:54.148 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.148 #2 INITED exec/s: 0 rss: 64Mb 00:07:54.148 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:54.148 This may also happen if the target rejected all inputs we tried so far 00:07:54.148 [2024-11-29 19:21:13.929457] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:54.148 [2024-11-29 19:21:13.930039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.148 [2024-11-29 19:21:13.930080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.148 [2024-11-29 19:21:13.930220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.148 [2024-11-29 19:21:13.930241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.407 NEW_FUNC[1/715]: 0x459f48 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:54.407 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.407 #6 NEW cov: 12301 ft: 12298 corp: 2/17b lim: 30 exec/s: 0 rss: 72Mb L: 16/16 MS: 4 ChangeByte-CrossOver-ChangeBit-InsertRepeatedBytes- 00:07:54.407 [2024-11-29 19:21:14.280482] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.407 [2024-11-29 19:21:14.280687] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.407 [2024-11-29 19:21:14.281063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.407 [2024-11-29 19:21:14.281101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.407 [2024-11-29 19:21:14.281230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.407 [2024-11-29 19:21:14.281248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.407 NEW_FUNC[1/2]: 0x109cf58 in posix_sock_read /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1527 00:07:54.407 NEW_FUNC[2/2]: 0x2182ce8 in spdk_pipe_writer_get_buffer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/pipe.c:92 00:07:54.407 #14 NEW cov: 12471 ft: 12801 corp: 3/33b lim: 30 exec/s: 0 rss: 72Mb L: 16/16 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:07:54.666 [2024-11-29 19:21:14.330567] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.666 [2024-11-29 19:21:14.330777] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.666 [2024-11-29 19:21:14.331146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.331179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.666 [2024-11-29 19:21:14.331309] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.331328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.666 #15 NEW cov: 12477 ft: 13134 corp: 4/49b lim: 30 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 ChangeBinInt- 00:07:54.666 [2024-11-29 19:21:14.400948] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:54.666 [2024-11-29 19:21:14.401855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.401884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.666 [2024-11-29 19:21:14.402007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.402028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.666 [2024-11-29 19:21:14.402164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.402184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.666 [2024-11-29 19:21:14.402319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.402339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.666 #16 NEW cov: 12562 ft: 14057 corp: 5/73b lim: 30 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 CrossOver- 00:07:54.666 [2024-11-29 19:21:14.470961] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.666 [2024-11-29 19:21:14.471159] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.666 [2024-11-29 19:21:14.471545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.471578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.666 [2024-11-29 19:21:14.471706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.471724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.666 #17 NEW cov: 12562 ft: 14282 corp: 6/89b lim: 30 exec/s: 0 rss: 72Mb L: 16/24 MS: 1 CopyPart- 00:07:54.666 [2024-11-29 19:21:14.541261] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.666 [2024-11-29 19:21:14.541475] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.666 [2024-11-29 19:21:14.541869] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.541903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.666 [2024-11-29 19:21:14.542031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.666 [2024-11-29 19:21:14.542049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.667 #18 NEW cov: 12562 ft: 14396 corp: 7/105b lim: 30 exec/s: 0 rss: 72Mb L: 16/24 MS: 1 ChangeByte- 00:07:54.925 [2024-11-29 19:21:14.591341] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.591544] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.591909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.591940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.926 [2024-11-29 19:21:14.592071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.592088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.926 #19 NEW cov: 12562 ft: 14526 corp: 8/121b lim: 30 exec/s: 0 rss: 73Mb L: 16/24 MS: 1 ShuffleBytes- 00:07:54.926 [2024-11-29 19:21:14.661769] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.661954] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:54.926 [2024-11-29 19:21:14.662285] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.662653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.662684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.926 [2024-11-29 19:21:14.662810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.662831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.926 [2024-11-29 19:21:14.662960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.662982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.926 [2024-11-29 19:21:14.663107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 
cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.663126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.926 #20 NEW cov: 12562 ft: 14585 corp: 9/145b lim: 30 exec/s: 0 rss: 73Mb L: 24/24 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:54.926 [2024-11-29 19:21:14.711710] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:54.926 [2024-11-29 19:21:14.711893] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.712254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.712284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.926 [2024-11-29 19:21:14.712415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.712435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.926 #21 NEW cov: 12562 ft: 14655 corp: 10/162b lim: 30 exec/s: 0 rss: 73Mb L: 17/24 MS: 1 InsertByte- 00:07:54.926 [2024-11-29 19:21:14.761859] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.762037] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.926 [2024-11-29 19:21:14.762413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.762444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.926 [2024-11-29 19:21:14.762576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.926 [2024-11-29 19:21:14.762593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.926 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:54.926 #22 NEW cov: 12585 ft: 14714 corp: 11/174b lim: 30 exec/s: 0 rss: 73Mb L: 12/24 MS: 1 EraseBytes- 00:07:54.926 [2024-11-29 19:21:14.832093] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:55.185 [2024-11-29 19:21:14.832663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.832700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.185 [2024-11-29 19:21:14.832837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.832855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:55.185 #23 NEW cov: 12585 ft: 14738 corp: 12/190b lim: 30 exec/s: 0 rss: 73Mb L: 16/24 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:55.185 [2024-11-29 19:21:14.882203] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.185 [2024-11-29 19:21:14.882402] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.185 [2024-11-29 19:21:14.882775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.882806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.185 [2024-11-29 19:21:14.882946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.882965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.185 #24 NEW cov: 12585 ft: 14747 corp: 13/206b lim: 30 exec/s: 0 rss: 73Mb L: 16/24 MS: 1 CrossOver- 00:07:55.185 [2024-11-29 19:21:14.932567] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.185 [2024-11-29 19:21:14.932749] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:55.185 [2024-11-29 19:21:14.933080] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.185 [2024-11-29 19:21:14.933465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.933496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.185 [2024-11-29 19:21:14.933626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.933656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.185 [2024-11-29 19:21:14.933780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.933798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.185 [2024-11-29 19:21:14.933936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:080183ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.185 [2024-11-29 19:21:14.933956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.185 #25 NEW cov: 12585 ft: 14851 corp: 14/230b lim: 30 exec/s: 25 rss: 73Mb L: 24/24 MS: 1 ChangeBinInt- 00:07:55.186 [2024-11-29 19:21:14.982479] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000008f6 00:07:55.186 [2024-11-29 19:21:14.982876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff930220 cdw11:00000002 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.186 [2024-11-29 19:21:14.982906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.186 #28 NEW cov: 12585 ft: 15204 corp: 15/239b lim: 30 exec/s: 28 rss: 73Mb L: 9/24 MS: 3 ChangeByte-CopyPart-CMP- DE: "\377\223 :\010\366G\004"- 00:07:55.186 [2024-11-29 19:21:15.032715] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:55.186 [2024-11-29 19:21:15.032907] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000018ff 00:07:55.186 [2024-11-29 19:21:15.033283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.186 [2024-11-29 19:21:15.033312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.186 [2024-11-29 19:21:15.033445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.186 [2024-11-29 19:21:15.033463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.186 #29 NEW cov: 12585 ft: 15211 corp: 16/256b lim: 30 exec/s: 29 rss: 73Mb L: 17/24 MS: 1 ChangeByte- 00:07:55.444 [2024-11-29 19:21:15.102881] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.444 [2024-11-29 19:21:15.103072] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.444 [2024-11-29 19:21:15.103434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.444 [2024-11-29 19:21:15.103463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.444 [2024-11-29 19:21:15.103603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.103622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.445 #30 NEW cov: 12585 ft: 15275 corp: 17/271b lim: 30 exec/s: 30 rss: 73Mb L: 15/24 MS: 1 EraseBytes- 00:07:55.445 [2024-11-29 19:21:15.173169] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:55.445 [2024-11-29 19:21:15.173730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.173759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.445 [2024-11-29 19:21:15.173890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.173908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.445 #31 NEW cov: 12585 ft: 15292 corp: 18/287b lim: 30 exec/s: 31 rss: 73Mb L: 16/24 MS: 1 ShuffleBytes- 
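(The recurring ctrlr.c:2669 errors above are the target's Get Log Page length validation tripping on fuzzed dwords: the transfer length is rebuilt from the 0-based NUMD field split across CDW10 and CDW11. A minimal C sketch of that arithmetic, assuming the NVMe-spec field layout rather than quoting SPDK's exact ctrlr.c code, reproduces the logged value:

    /* Sketch of the Get Log Page length math behind the logged
     * "len (11268) > buf size (4096)" errors.  Field layout per the
     * NVMe spec: NUMDL in CDW10[31:16], NUMDU in CDW11[15:0]; this is
     * illustrative, not the exact SPDK implementation. */
    #include <stdint.h>
    #include <stdio.h>

    static uint64_t glp_len(uint32_t cdw10, uint32_t cdw11)
    {
        uint32_t numdl = (cdw10 >> 16) & 0xffff;  /* low 16 bits of NUMD */
        uint32_t numdu = cdw11 & 0xffff;          /* high 16 bits of NUMD */
        uint64_t numd  = ((uint64_t)numdu << 16) | numdl;
        return (numd + 1) * 4;                    /* NUMD counts dwords, 0-based */
    }

    int main(void)
    {
        /* cdw10:0b000000 cdw11:00000000, as printed in the records above */
        printf("%llu\n", (unsigned long long)glp_len(0x0b000000, 0));
        return 0;
    }

With cdw10 0x0b000000 this gives (0x0b00 + 1) * 4 = 11268 bytes, exactly the length the target rejects against its 4096-byte buffer.)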
00:07:55.445 [2024-11-29 19:21:15.223383] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:55.445 [2024-11-29 19:21:15.223949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.223979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.445 [2024-11-29 19:21:15.224116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.224135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.445 #32 NEW cov: 12585 ft: 15428 corp: 19/304b lim: 30 exec/s: 32 rss: 73Mb L: 17/24 MS: 1 InsertByte- 00:07:55.445 [2024-11-29 19:21:15.293766] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.445 [2024-11-29 19:21:15.293950] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:55.445 [2024-11-29 19:21:15.294133] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:55.445 [2024-11-29 19:21:15.294303] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:55.445 [2024-11-29 19:21:15.294663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.294694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.445 [2024-11-29 19:21:15.294831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.294851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.445 [2024-11-29 19:21:15.294979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.294997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.445 [2024-11-29 19:21:15.295125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.295143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.445 #33 NEW cov: 12585 ft: 15459 corp: 20/328b lim: 30 exec/s: 33 rss: 73Mb L: 24/24 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:55.445 [2024-11-29 19:21:15.343765] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:55.445 [2024-11-29 19:21:15.343940] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.445 [2024-11-29 19:21:15.344320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.344350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.445 [2024-11-29 19:21:15.344481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.445 [2024-11-29 19:21:15.344501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.704 #34 NEW cov: 12585 ft: 15477 corp: 21/341b lim: 30 exec/s: 34 rss: 73Mb L: 13/24 MS: 1 EraseBytes- 00:07:55.704 [2024-11-29 19:21:15.393761] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.704 [2024-11-29 19:21:15.394161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.394192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.704 #35 NEW cov: 12585 ft: 15500 corp: 22/350b lim: 30 exec/s: 35 rss: 73Mb L: 9/24 MS: 1 EraseBytes- 00:07:55.704 [2024-11-29 19:21:15.444047] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.704 [2024-11-29 19:21:15.444260] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.704 [2024-11-29 19:21:15.444650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.444682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.704 [2024-11-29 19:21:15.444809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.444832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.704 #36 NEW cov: 12585 ft: 15503 corp: 23/366b lim: 30 exec/s: 36 rss: 73Mb L: 16/24 MS: 1 ChangeByte- 00:07:55.704 [2024-11-29 19:21:15.514290] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:55.704 [2024-11-29 19:21:15.514682] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x86 00:07:55.704 [2024-11-29 19:21:15.515093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.515125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.704 [2024-11-29 19:21:15.515256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.515277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.704 [2024-11-29 19:21:15.515405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:98980098 cdw11:00000000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.515422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.704 #37 NEW cov: 12585 ft: 15722 corp: 24/386b lim: 30 exec/s: 37 rss: 73Mb L: 20/24 MS: 1 InsertRepeatedBytes- 00:07:55.704 [2024-11-29 19:21:15.584525] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:55.704 [2024-11-29 19:21:15.584924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:dfff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.704 [2024-11-29 19:21:15.584956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.704 #41 NEW cov: 12585 ft: 15749 corp: 25/394b lim: 30 exec/s: 41 rss: 73Mb L: 8/24 MS: 4 InsertRepeatedBytes-ChangeBit-ShuffleBytes-CopyPart- 00:07:55.964 [2024-11-29 19:21:15.634624] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001aff 00:07:55.964 [2024-11-29 19:21:15.634999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.635030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.964 #42 NEW cov: 12585 ft: 15769 corp: 26/401b lim: 30 exec/s: 42 rss: 73Mb L: 7/24 MS: 1 CrossOver- 00:07:55.964 [2024-11-29 19:21:15.704900] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.964 [2024-11-29 19:21:15.705289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.705338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.964 #43 NEW cov: 12585 ft: 15782 corp: 27/410b lim: 30 exec/s: 43 rss: 73Mb L: 9/24 MS: 1 EraseBytes- 00:07:55.964 [2024-11-29 19:21:15.775111] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000293f 00:07:55.964 [2024-11-29 19:21:15.775302] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.964 [2024-11-29 19:21:15.775661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.775693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.964 [2024-11-29 19:21:15.775823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.775845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.964 #44 NEW cov: 12585 ft: 15811 corp: 28/424b lim: 30 exec/s: 44 rss: 74Mb L: 14/24 MS: 1 InsertByte- 00:07:55.964 [2024-11-29 19:21:15.845427] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11268) > buf size (4096) 00:07:55.964 [2024-11-29 19:21:15.845779] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x86 00:07:55.964 
[2024-11-29 19:21:15.846152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.846184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.964 [2024-11-29 19:21:15.846318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.846338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.964 [2024-11-29 19:21:15.846481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:98980098 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.964 [2024-11-29 19:21:15.846500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.224 #50 NEW cov: 12585 ft: 15823 corp: 29/444b lim: 30 exec/s: 50 rss: 74Mb L: 20/24 MS: 1 ShuffleBytes- 00:07:56.224 [2024-11-29 19:21:15.915497] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:56.224 [2024-11-29 19:21:15.915703] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.224 [2024-11-29 19:21:15.916076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.224 [2024-11-29 19:21:15.916106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.224 [2024-11-29 19:21:15.916239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.224 [2024-11-29 19:21:15.916259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.224 #51 NEW cov: 12585 ft: 15856 corp: 30/461b lim: 30 exec/s: 25 rss: 74Mb L: 17/24 MS: 1 ChangeBit- 00:07:56.224 #51 DONE cov: 12585 ft: 15856 corp: 30/461b lim: 30 exec/s: 25 rss: 74Mb 00:07:56.224 ###### Recommended dictionary. ###### 00:07:56.224 "\001\000\000\000\000\000\000\000" # Uses: 2 00:07:56.224 "\377\223 :\010\366G\004" # Uses: 0 00:07:56.224 ###### End of recommended dictionary. 
###### 00:07:56.224 Done 51 runs in 2 second(s) 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:56.224 19:21:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:56.224 [2024-11-29 19:21:16.082277] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:56.224 [2024-11-29 19:21:16.082356] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1746842 ] 00:07:56.483 [2024-11-29 19:21:16.280808] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.483 [2024-11-29 19:21:16.293219] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.483 [2024-11-29 19:21:16.345679] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.483 [2024-11-29 19:21:16.361981] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:56.483 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.483 INFO: Seed: 2687216224 00:07:56.742 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:07:56.742 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:07:56.742 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.742 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.742 #2 INITED exec/s: 0 rss: 64Mb 00:07:56.742 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.742 This may also happen if the target rejected all inputs we tried so far 00:07:56.742 [2024-11-29 19:21:16.411496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.742 [2024-11-29 19:21:16.411524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.742 [2024-11-29 19:21:16.411583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.742 [2024-11-29 19:21:16.411602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.742 [2024-11-29 19:21:16.411657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.742 [2024-11-29 19:21:16.411671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.001 NEW_FUNC[1/716]: 0x45c9f8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:57.001 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.001 #12 NEW cov: 12274 ft: 12261 corp: 2/23b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 5 ShuffleBytes-ChangeByte-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:07:57.001 [2024-11-29 19:21:16.732359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.732391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.001 [2024-11-29 19:21:16.732452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.732467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.001 [2024-11-29 19:21:16.732525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.732540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.001 #28 NEW cov: 12387 ft: 12666 corp: 3/45b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 ShuffleBytes- 00:07:57.001 [2024-11-29 19:21:16.792438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.792466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.001 [2024-11-29 19:21:16.792526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.792542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.001 [2024-11-29 19:21:16.792604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.792618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.001 #29 NEW cov: 12393 ft: 13051 corp: 4/67b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 CopyPart- 00:07:57.001 [2024-11-29 19:21:16.852569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.852596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.001 [2024-11-29 19:21:16.852660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.852675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.001 [2024-11-29 19:21:16.852730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.852743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.001 #30 NEW cov: 12478 ft: 13468 corp: 5/92b lim: 35 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:57.001 [2024-11-29 19:21:16.892433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.001 [2024-11-29 19:21:16.892460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.261 #31 NEW cov: 12478 ft: 13985 corp: 6/99b lim: 35 exec/s: 0 rss: 72Mb L: 7/25 MS: 1 CrossOver- 00:07:57.261 [2024-11-29 19:21:16.932845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:16.932872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:16.932933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:16.932948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 
19:21:16.933006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:16.933021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.261 #32 NEW cov: 12478 ft: 14095 corp: 7/121b lim: 35 exec/s: 0 rss: 72Mb L: 22/25 MS: 1 ChangeByte- 00:07:57.261 [2024-11-29 19:21:16.972963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:16.972990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:16.973049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:16.973064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:16.973119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:16.973133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.261 #33 NEW cov: 12478 ft: 14137 corp: 8/147b lim: 35 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 CopyPart- 00:07:57.261 [2024-11-29 19:21:17.033109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:17.033135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:17.033193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:17.033208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:17.033263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:17.033277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.261 #34 NEW cov: 12478 ft: 14143 corp: 9/169b lim: 35 exec/s: 0 rss: 72Mb L: 22/26 MS: 1 ChangeByte- 00:07:57.261 [2024-11-29 19:21:17.072950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:17.072976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.261 #35 NEW cov: 12478 ft: 14210 corp: 10/179b lim: 35 exec/s: 0 rss: 72Mb L: 10/26 MS: 1 CopyPart- 00:07:57.261 [2024-11-29 19:21:17.133376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
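(The IDENTIFY (06) records in this run are produced by fuzz_admin_identify_command driven through TestOneInput in llvm_nvme_fuzz.c, per the NEW_FUNC lines above. A hedged sketch of the general libFuzzer entry-point shape follows; cmd_template and the replay step are illustrative stand-ins, not SPDK's actual types, and only LLVMFuzzerTestOneInput is the real hook:

    /* Sketch of a libFuzzer entry point in the shape of TestOneInput:
     * interpret the raw input as a command template and replay it. */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    struct cmd_template {
        uint8_t  opc;            /* e.g. 0x06 for IDENTIFY */
        uint32_t cdw10, cdw11;   /* the dwords being mutated in the log */
    };

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct cmd_template cmd;

        if (size < sizeof(cmd)) {
            return 0;            /* not enough bytes to form a command */
        }
        memcpy(&cmd, data, sizeof(cmd));
        /* a real harness submits cmd on an admin qpair and checks the
         * completion; the *NOTICE* lines above are those submissions */
        return 0;
    }

Built with clang -fsanitize=fuzzer, libFuzzer supplies main() and feeds mutated inputs; each "#NN NEW" line above records an input that reached new coverage.)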
00:07:57.261 [2024-11-29 19:21:17.133401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:17.133478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:17.133493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.261 [2024-11-29 19:21:17.133554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff007e cdw11:6f00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.261 [2024-11-29 19:21:17.133569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 #36 NEW cov: 12478 ft: 14239 corp: 11/201b lim: 35 exec/s: 0 rss: 72Mb L: 22/26 MS: 1 ChangeByte- 00:07:57.521 [2024-11-29 19:21:17.193527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdf00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.193553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.193637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.193653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.193717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.193731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 #37 NEW cov: 12478 ft: 14269 corp: 12/223b lim: 35 exec/s: 0 rss: 72Mb L: 22/26 MS: 1 ChangeBit- 00:07:57.521 [2024-11-29 19:21:17.233635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdf00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.233661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.233748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.233764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.233823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:3000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.233839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 #38 NEW cov: 12478 ft: 14327 corp: 13/245b lim: 35 exec/s: 0 rss: 73Mb L: 22/26 MS: 1 ChangeByte- 00:07:57.521 [2024-11-29 19:21:17.293542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:4 nsid:0 cdw10:1a1a001a cdw11:1f001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.293568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:57.521 #39 NEW cov: 12501 ft: 14379 corp: 14/252b lim: 35 exec/s: 0 rss: 73Mb L: 7/26 MS: 1 ChangeBinInt- 00:07:57.521 [2024-11-29 19:21:17.333930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdf00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.333956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.334030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.334045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.334103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.334120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 #40 NEW cov: 12501 ft: 14417 corp: 15/275b lim: 35 exec/s: 0 rss: 73Mb L: 23/26 MS: 1 InsertByte- 00:07:57.521 [2024-11-29 19:21:17.373868] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.521 [2024-11-29 19:21:17.374121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.374148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.374207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.374222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.521 [2024-11-29 19:21:17.374280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.521 [2024-11-29 19:21:17.374296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.521 #41 NEW cov: 12512 ft: 14524 corp: 16/301b lim: 35 exec/s: 41 rss: 73Mb L: 26/26 MS: 1 CMP- DE: "\000\000"- 00:07:57.781 [2024-11-29 19:21:17.433911] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.781 [2024-11-29 19:21:17.434275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.781 [2024-11-29 19:21:17.434302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:57.781 [2024-11-29 19:21:17.434362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.781 [2024-11-29 19:21:17.434379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.781 [2024-11-29 19:21:17.434437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.781 [2024-11-29 19:21:17.434451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.781 #47 NEW cov: 12512 ft: 14563 corp: 17/323b lim: 35 exec/s: 47 rss: 73Mb L: 22/26 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:57.781 [2024-11-29 19:21:17.474330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.781 [2024-11-29 19:21:17.474356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.781 [2024-11-29 19:21:17.474431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a001a cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.781 [2024-11-29 19:21:17.474447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 19:21:17.474503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1a1a001a cdw11:0a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.474517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.782 #48 NEW cov: 12512 ft: 14573 corp: 18/348b lim: 35 exec/s: 48 rss: 73Mb L: 25/26 MS: 1 ChangeBit- 00:07:57.782 [2024-11-29 19:21:17.534054] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.782 [2024-11-29 19:21:17.534400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a0000 cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.534427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 19:21:17.534486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00fc cdw11:00001af5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.534501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.782 #53 NEW cov: 12512 ft: 14758 corp: 19/363b lim: 35 exec/s: 53 rss: 73Mb L: 15/26 MS: 5 ShuffleBytes-PersAutoDict-PersAutoDict-ChangeBinInt-CrossOver- DE: "\000\000"-"\000\000"- 00:07:57.782 [2024-11-29 19:21:17.574629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff2900ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.574655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 
19:21:17.574712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.574726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 19:21:17.574784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:7eff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.574798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.782 #54 NEW cov: 12512 ft: 14760 corp: 20/386b lim: 35 exec/s: 54 rss: 73Mb L: 23/26 MS: 1 InsertByte- 00:07:57.782 [2024-11-29 19:21:17.614745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdf00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.614771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 19:21:17.614847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:1a00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.614861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 19:21:17.614920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1aff001a cdw11:ff002cff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.614933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.782 #60 NEW cov: 12512 ft: 14788 corp: 21/413b lim: 35 exec/s: 60 rss: 73Mb L: 27/27 MS: 1 CrossOver- 00:07:57.782 [2024-11-29 19:21:17.674778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.674803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.782 [2024-11-29 19:21:17.674879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.782 [2024-11-29 19:21:17.674894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.042 #61 NEW cov: 12512 ft: 14816 corp: 22/428b lim: 35 exec/s: 61 rss: 73Mb L: 15/27 MS: 1 EraseBytes- 00:07:58.042 [2024-11-29 19:21:17.715046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.715075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.715137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.715151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.715209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.715223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.755316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.755341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.755417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.755431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.755492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.755507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.755567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.755581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.042 #63 NEW cov: 12512 ft: 15355 corp: 23/458b lim: 35 exec/s: 63 rss: 73Mb L: 30/30 MS: 2 PersAutoDict-InsertRepeatedBytes- DE: "\000\000"- 00:07:58.042 [2024-11-29 19:21:17.794995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.795021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.042 #64 NEW cov: 12512 ft: 15429 corp: 24/467b lim: 35 exec/s: 64 rss: 73Mb L: 9/30 MS: 1 EraseBytes- 00:07:58.042 [2024-11-29 19:21:17.855384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffdf00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.855409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.855471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff3000ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.855485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.855545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff002c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.855559] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.042 #65 NEW cov: 12512 ft: 15445 corp: 25/491b lim: 35 exec/s: 65 rss: 73Mb L: 24/30 MS: 1 InsertByte- 00:07:58.042 [2024-11-29 19:21:17.895109] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.042 [2024-11-29 19:21:17.895586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a0000 cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.895623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.042 [2024-11-29 19:21:17.895684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00fc cdw11:00001af5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.042 [2024-11-29 19:21:17.895698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.042 #66 NEW cov: 12512 ft: 15481 corp: 26/515b lim: 35 exec/s: 66 rss: 73Mb L: 24/30 MS: 1 InsertRepeatedBytes- 00:07:58.302 [2024-11-29 19:21:17.955537] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.302 [2024-11-29 19:21:17.955898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.302 [2024-11-29 19:21:17.955925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.302 [2024-11-29 19:21:17.955987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.302 [2024-11-29 19:21:17.956002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.302 [2024-11-29 19:21:17.956059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.302 [2024-11-29 19:21:17.956073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.302 [2024-11-29 19:21:17.956131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.302 [2024-11-29 19:21:17.956145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.302 #67 NEW cov: 12512 ft: 15496 corp: 27/546b lim: 35 exec/s: 67 rss: 73Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:58.303 [2024-11-29 19:21:17.995529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.303 [2024-11-29 19:21:17.995554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.303 #68 NEW cov: 12512 ft: 15528 corp: 28/558b lim: 35 exec/s: 68 rss: 73Mb L: 12/31 MS: 1 EraseBytes- 00:07:58.303 [2024-11-29 19:21:18.055719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff 
cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.303 [2024-11-29 19:21:18.055745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.303 #69 NEW cov: 12512 ft: 15543 corp: 29/570b lim: 35 exec/s: 69 rss: 73Mb L: 12/31 MS: 1 EraseBytes- 00:07:58.303 [2024-11-29 19:21:18.116208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.303 [2024-11-29 19:21:18.116235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.303 [2024-11-29 19:21:18.116295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0060 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.303 [2024-11-29 19:21:18.116309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.303 [2024-11-29 19:21:18.116367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.303 [2024-11-29 19:21:18.116386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.303 #70 NEW cov: 12512 ft: 15554 corp: 30/593b lim: 35 exec/s: 70 rss: 73Mb L: 23/31 MS: 1 InsertByte- 00:07:58.303 [2024-11-29 19:21:18.156041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.303 [2024-11-29 19:21:18.156066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.303 #71 NEW cov: 12512 ft: 15572 corp: 31/605b lim: 35 exec/s: 71 rss: 74Mb L: 12/31 MS: 1 ShuffleBytes- 00:07:58.563 [2024-11-29 19:21:18.216650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.216676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.563 [2024-11-29 19:21:18.216735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff0040ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.216760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.563 [2024-11-29 19:21:18.216814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.216827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.563 [2024-11-29 19:21:18.216885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.216899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.563 #72 NEW cov: 12512 ft: 15586 corp: 32/635b lim: 35 
exec/s: 72 rss: 74Mb L: 30/31 MS: 1 ChangeByte- 00:07:58.563 [2024-11-29 19:21:18.276625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.276652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.563 [2024-11-29 19:21:18.276725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.276739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.563 [2024-11-29 19:21:18.276798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c1ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.276812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.563 #73 NEW cov: 12512 ft: 15659 corp: 33/657b lim: 35 exec/s: 73 rss: 74Mb L: 22/31 MS: 1 ChangeByte- 00:07:58.563 [2024-11-29 19:21:18.316316] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.563 [2024-11-29 19:21:18.316678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1a1a0000 cdw11:1a001a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.316707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.563 [2024-11-29 19:21:18.316770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1a1a00fc cdw11:00001af5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.316785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.563 #74 NEW cov: 12512 ft: 15676 corp: 34/672b lim: 35 exec/s: 74 rss: 74Mb L: 15/31 MS: 1 ChangeByte- 00:07:58.563 [2024-11-29 19:21:18.356601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ab cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.563 [2024-11-29 19:21:18.356628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.563 #75 NEW cov: 12512 ft: 15730 corp: 35/684b lim: 35 exec/s: 37 rss: 74Mb L: 12/31 MS: 1 ChangeByte- 00:07:58.563 #75 DONE cov: 12512 ft: 15730 corp: 35/684b lim: 35 exec/s: 37 rss: 74Mb 00:07:58.563 ###### Recommended dictionary. ###### 00:07:58.563 "\000\000" # Uses: 5 00:07:58.563 ###### End of recommended dictionary. 
###### 00:07:58.563 Done 75 runs in 2 second(s) 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:58.823 19:21:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:58.823 [2024-11-29 19:21:18.546256] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:07:58.823 [2024-11-29 19:21:18.546327] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1747230 ] 00:07:59.083 [2024-11-29 19:21:18.737194] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.083 [2024-11-29 19:21:18.749810] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.083 [2024-11-29 19:21:18.802374] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.083 [2024-11-29 19:21:18.818732] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:59.083 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:59.083 INFO: Seed: 849258951 00:07:59.083 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:07:59.083 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:07:59.083 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:59.083 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.083 #2 INITED exec/s: 0 rss: 64Mb 00:07:59.083 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:59.083 This may also happen if the target rejected all inputs we tried so far 00:07:59.342 NEW_FUNC[1/705]: 0x45e6d8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:59.342 NEW_FUNC[2/705]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.342 #9 NEW cov: 12183 ft: 12184 corp: 2/11b lim: 20 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 InsertByte-CMP- DE: "\351\205\010\320; \224\000"- 00:07:59.342 #11 NEW cov: 12296 ft: 13020 corp: 3/16b lim: 20 exec/s: 0 rss: 72Mb L: 5/10 MS: 2 ChangeByte-CMP- DE: "\377\377\377\377"- 00:07:59.602 #12 NEW cov: 12319 ft: 13615 corp: 4/35b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:59.602 #13 NEW cov: 12404 ft: 13886 corp: 5/53b lim: 20 exec/s: 0 rss: 72Mb L: 18/19 MS: 1 PersAutoDict- DE: "\351\205\010\320; \224\000"- 00:07:59.602 #14 NEW cov: 12408 ft: 14045 corp: 6/66b lim: 20 exec/s: 0 rss: 72Mb L: 13/19 MS: 1 CrossOver- 00:07:59.602 #19 NEW cov: 12408 ft: 14221 corp: 7/76b lim: 20 exec/s: 0 rss: 72Mb L: 10/19 MS: 5 InsertByte-ChangeBinInt-ChangeBinInt-ShuffleBytes-PersAutoDict- DE: "\351\205\010\320; \224\000"- 00:07:59.861 #20 NEW cov: 12408 ft: 14334 corp: 8/86b lim: 20 exec/s: 0 rss: 72Mb L: 10/19 MS: 1 ChangeByte- 00:07:59.861 #21 NEW cov: 12408 ft: 14419 corp: 9/96b lim: 20 exec/s: 0 rss: 72Mb L: 10/19 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:59.861 #27 NEW cov: 12408 ft: 14495 corp: 10/109b lim: 20 exec/s: 0 rss: 72Mb L: 13/19 MS: 1 ChangeBinInt- 00:07:59.861 #28 NEW cov: 12408 ft: 14537 corp: 11/115b lim: 20 exec/s: 0 rss: 72Mb L: 6/19 MS: 1 InsertByte- 00:08:00.121 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:00.121 #29 NEW cov: 12431 ft: 14606 corp: 12/134b lim: 20 exec/s: 0 rss: 73Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:08:00.121 #30 NEW cov: 12431 ft: 14640 corp: 13/144b lim: 20 exec/s: 0 rss: 73Mb L: 10/19 MS: 1 ChangeByte- 00:08:00.121 #34 NEW cov: 12431 ft: 14702 corp: 14/158b lim: 20 exec/s: 34 rss: 73Mb L: 14/19 MS: 4 CopyPart-InsertByte-ShuffleBytes-CrossOver- 00:08:00.121 #35 NEW cov: 12431 ft: 14758 corp: 15/168b lim: 20 exec/s: 35 rss: 73Mb L: 10/19 MS: 1 ShuffleBytes- 00:08:00.121 #36 NEW cov: 12431 ft: 14819 corp: 16/173b lim: 20 exec/s: 36 rss: 73Mb L: 5/19 MS: 1 EraseBytes- 00:08:00.380 #42 NEW cov: 12431 ft: 14971 corp: 17/191b lim: 20 exec/s: 42 rss: 73Mb L: 18/19 MS: 1 CMP- DE: "\351z\351z< \224\000"- 00:08:00.380 #45 NEW cov: 12431 ft: 14988 corp: 18/195b lim: 20 exec/s: 45 rss: 73Mb L: 4/19 MS: 3 CrossOver-ChangeByte-InsertByte- 00:08:00.380 #46 NEW cov: 12431 ft: 15030 corp: 19/205b lim: 20 exec/s: 46 rss: 73Mb L: 10/19 MS: 1 ChangeBinInt- 00:08:00.380 #47 NEW cov: 12431 ft: 15067 corp: 20/218b lim: 20 exec/s: 47 rss: 73Mb L: 13/19 MS: 1 PersAutoDict- DE: "\351\205\010\320; \224\000"- 00:08:00.638 #48 NEW cov: 
12431 ft: 15085 corp: 21/224b lim: 20 exec/s: 48 rss: 73Mb L: 6/19 MS: 1 ShuffleBytes- 00:08:00.638 #49 NEW cov: 12431 ft: 15113 corp: 22/234b lim: 20 exec/s: 49 rss: 73Mb L: 10/19 MS: 1 PersAutoDict- DE: "\351z\351z< \224\000"- 00:08:00.638 #50 NEW cov: 12431 ft: 15148 corp: 23/244b lim: 20 exec/s: 50 rss: 73Mb L: 10/19 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:00.638 #51 NEW cov: 12431 ft: 15156 corp: 24/263b lim: 20 exec/s: 51 rss: 73Mb L: 19/19 MS: 1 CopyPart- 00:08:00.898 #52 NEW cov: 12431 ft: 15160 corp: 25/277b lim: 20 exec/s: 52 rss: 73Mb L: 14/19 MS: 1 ChangeBinInt- 00:08:00.898 #53 NEW cov: 12431 ft: 15165 corp: 26/293b lim: 20 exec/s: 53 rss: 74Mb L: 16/19 MS: 1 EraseBytes- 00:08:00.898 #54 NEW cov: 12431 ft: 15206 corp: 27/306b lim: 20 exec/s: 54 rss: 74Mb L: 13/19 MS: 1 PersAutoDict- DE: "\351z\351z< \224\000"- 00:08:00.898 #55 NEW cov: 12431 ft: 15214 corp: 28/316b lim: 20 exec/s: 55 rss: 74Mb L: 10/19 MS: 1 ChangeBinInt- 00:08:00.898 #56 NEW cov: 12431 ft: 15216 corp: 29/329b lim: 20 exec/s: 56 rss: 74Mb L: 13/19 MS: 1 ShuffleBytes- 00:08:01.158 #57 NEW cov: 12431 ft: 15228 corp: 30/346b lim: 20 exec/s: 28 rss: 74Mb L: 17/19 MS: 1 InsertByte- 00:08:01.158 #57 DONE cov: 12431 ft: 15228 corp: 30/346b lim: 20 exec/s: 28 rss: 74Mb 00:08:01.158 ###### Recommended dictionary. ###### 00:08:01.158 "\351\205\010\320; \224\000" # Uses: 3 00:08:01.158 "\377\377\377\377" # Uses: 2 00:08:01.158 "\351z\351z< \224\000" # Uses: 2 00:08:01.158 ###### End of recommended dictionary. ###### 00:08:01.158 Done 57 runs in 2 second(s) 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo 
leak:spdk_nvmf_qpair_disconnect 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:01.158 19:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:01.159 [2024-11-29 19:21:21.016436] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:01.159 [2024-11-29 19:21:21.016505] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1747663 ] 00:08:01.419 [2024-11-29 19:21:21.201175] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.419 [2024-11-29 19:21:21.213471] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.419 [2024-11-29 19:21:21.266208] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.419 [2024-11-29 19:21:21.282517] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:01.419 INFO: Running with entropic power schedule (0xFF, 100). 00:08:01.419 INFO: Seed: 3312275058 00:08:01.419 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:01.419 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:01.419 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.419 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.419 #2 INITED exec/s: 0 rss: 65Mb 00:08:01.419 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:01.419 This may also happen if the target rejected all inputs we tried so far 00:08:01.678 [2024-11-29 19:21:21.331497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.678 [2024-11-29 19:21:21.331526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.678 [2024-11-29 19:21:21.331585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.678 [2024-11-29 19:21:21.331605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.938 NEW_FUNC[1/717]: 0x45f7d8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:01.938 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.938 #8 NEW cov: 12296 ft: 12295 corp: 2/16b lim: 35 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:08:01.938 [2024-11-29 19:21:21.652547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.652579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.652635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.652650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.652704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.652717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.652772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.652785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.938 #14 NEW cov: 12409 ft: 13243 corp: 3/48b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:01.938 [2024-11-29 19:21:21.692625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.692653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.692708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.692723] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.692777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.692790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.692845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.692858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.938 #15 NEW cov: 12415 ft: 13526 corp: 4/79b lim: 35 exec/s: 0 rss: 72Mb L: 31/32 MS: 1 InsertRepeatedBytes- 00:08:01.938 [2024-11-29 19:21:21.732378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.732404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.732460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:000000fc cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.732474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.938 #16 NEW cov: 12500 ft: 13805 corp: 5/94b lim: 35 exec/s: 0 rss: 72Mb L: 15/32 MS: 1 ChangeBinInt- 00:08:01.938 [2024-11-29 19:21:21.792871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.792897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.792968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.792983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.793038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.793052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.793107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.793120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.938 #17 NEW cov: 12500 ft: 13915 corp: 6/127b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:01.938 [2024-11-29 19:21:21.832975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.833001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.833059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.833072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.833127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.833140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.938 [2024-11-29 19:21:21.833194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.938 [2024-11-29 19:21:21.833208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.199 #18 NEW cov: 12500 ft: 14088 corp: 7/158b lim: 35 exec/s: 0 rss: 72Mb L: 31/33 MS: 1 ChangeBit- 00:08:02.199 [2024-11-29 19:21:21.893198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:21.893226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.199 [2024-11-29 19:21:21.893285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:21.893299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.199 [2024-11-29 19:21:21.893353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:21.893367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.199 [2024-11-29 19:21:21.893421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffff9ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:21.893434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.199 #19 NEW cov: 12500 ft: 14172 corp: 8/189b lim: 35 exec/s: 0 rss: 72Mb L: 31/33 MS: 1 ChangeBit- 00:08:02.199 [2024-11-29 19:21:21.952817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:21.952842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.199 #20 NEW cov: 12500 ft: 14908 corp: 9/202b lim: 35 exec/s: 0 rss: 72Mb L: 13/33 MS: 1 EraseBytes- 00:08:02.199 [2024-11-29 19:21:21.992950] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:21.992976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.199 #21 NEW cov: 12500 ft: 14984 corp: 10/215b lim: 35 exec/s: 0 rss: 73Mb L: 13/33 MS: 1 CopyPart- 00:08:02.199 [2024-11-29 19:21:22.053579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:22.053609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.199 [2024-11-29 19:21:22.053665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:22.053679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.199 [2024-11-29 19:21:22.053733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:22.053746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.199 [2024-11-29 19:21:22.053800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:22.053814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.199 #25 NEW cov: 12500 ft: 15048 corp: 11/246b lim: 35 exec/s: 0 rss: 73Mb L: 31/33 MS: 4 ShuffleBytes-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:02.199 [2024-11-29 19:21:22.093249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.199 [2024-11-29 19:21:22.093274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.459 #26 NEW cov: 12500 ft: 15153 corp: 12/257b lim: 35 exec/s: 0 rss: 73Mb L: 11/33 MS: 1 EraseBytes- 00:08:02.459 [2024-11-29 19:21:22.133323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.133352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.459 #32 NEW cov: 12500 ft: 15185 corp: 13/268b lim: 35 exec/s: 0 rss: 73Mb L: 11/33 MS: 1 EraseBytes- 00:08:02.459 [2024-11-29 19:21:22.173980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.174007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.174063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.174077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.174132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.174146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.174201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.174215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.459 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:02.459 #33 NEW cov: 12523 ft: 15222 corp: 14/300b lim: 35 exec/s: 0 rss: 73Mb L: 32/33 MS: 1 CrossOver- 00:08:02.459 [2024-11-29 19:21:22.234086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.234112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.234170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.234184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.234239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.234253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.234308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffff9ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.234321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.459 #39 NEW cov: 12523 ft: 15260 corp: 15/331b lim: 35 exec/s: 0 rss: 73Mb L: 31/33 MS: 1 ShuffleBytes- 00:08:02.459 [2024-11-29 19:21:22.294258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.294283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.294339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff5bffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.294353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 
19:21:22.294412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.294426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.294495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffff9 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.294509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.459 #40 NEW cov: 12523 ft: 15313 corp: 16/363b lim: 35 exec/s: 40 rss: 73Mb L: 32/33 MS: 1 InsertByte- 00:08:02.459 [2024-11-29 19:21:22.334370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.334395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.334450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.334464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.334519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.334532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.459 [2024-11-29 19:21:22.334588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.459 [2024-11-29 19:21:22.334604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.459 #41 NEW cov: 12523 ft: 15322 corp: 17/395b lim: 35 exec/s: 41 rss: 73Mb L: 32/33 MS: 1 ChangeBit- 00:08:02.719 [2024-11-29 19:21:22.374527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.374552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.374613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.374627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.374682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.374695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 
19:21:22.374748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.374761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.719 #42 NEW cov: 12523 ft: 15346 corp: 18/429b lim: 35 exec/s: 42 rss: 73Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:02.719 [2024-11-29 19:21:22.414624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.414651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.414712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00c50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.414725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.414779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f17f10e0 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.414792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.414845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.414858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.719 #43 NEW cov: 12523 ft: 15351 corp: 19/462b lim: 35 exec/s: 43 rss: 73Mb L: 33/34 MS: 1 CMP- DE: "\305\027\020\340\361\177\000\000"- 00:08:02.719 [2024-11-29 19:21:22.474304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00001500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.474329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.719 #44 NEW cov: 12523 ft: 15353 corp: 20/475b lim: 35 exec/s: 44 rss: 73Mb L: 13/34 MS: 1 ChangeByte- 00:08:02.719 [2024-11-29 19:21:22.534960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.534986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.535042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.535056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.535109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3df09420 cdw11:55710002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 
19:21:22.535123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.535179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.535192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.719 #45 NEW cov: 12523 ft: 15366 corp: 21/506b lim: 35 exec/s: 45 rss: 73Mb L: 31/34 MS: 1 CMP- DE: "\001\224 =\360Uql"- 00:08:02.719 [2024-11-29 19:21:22.574744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.574770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.574825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.574839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.719 #46 NEW cov: 12523 ft: 15374 corp: 22/524b lim: 35 exec/s: 46 rss: 73Mb L: 18/34 MS: 1 InsertRepeatedBytes- 00:08:02.719 [2024-11-29 19:21:22.615145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.615173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.615229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.615243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.615298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.615312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.719 [2024-11-29 19:21:22.615365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00fd0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.719 [2024-11-29 19:21:22.615378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.979 #47 NEW cov: 12523 ft: 15386 corp: 23/555b lim: 35 exec/s: 47 rss: 73Mb L: 31/34 MS: 1 ChangeBinInt- 00:08:02.979 [2024-11-29 19:21:22.675373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0adf cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.675398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.675453] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.675466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.675521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.675535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.675587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffff9ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.675604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.979 #48 NEW cov: 12523 ft: 15405 corp: 24/586b lim: 35 exec/s: 48 rss: 73Mb L: 31/34 MS: 1 ChangeBit- 00:08:02.979 [2024-11-29 19:21:22.715610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.715636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.715693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.715707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.715761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.715774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.715827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.715840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.715900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.715913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.979 #49 NEW cov: 12523 ft: 15521 corp: 25/621b lim: 35 exec/s: 49 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:08:02.979 [2024-11-29 19:21:22.755607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.755633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.755689] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.755703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.755758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:9420ff01 cdw11:3df00002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.755770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.755828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.755841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.979 #50 NEW cov: 12523 ft: 15539 corp: 26/652b lim: 35 exec/s: 50 rss: 73Mb L: 31/35 MS: 1 CopyPart- 00:08:02.979 [2024-11-29 19:21:22.815751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.815776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.815849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:171000c5 cdw11:e0f10002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.815863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.815918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.815932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.815988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.816002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.979 #51 NEW cov: 12523 ft: 15562 corp: 27/684b lim: 35 exec/s: 51 rss: 73Mb L: 32/35 MS: 1 PersAutoDict- DE: "\305\027\020\340\361\177\000\000"- 00:08:02.979 [2024-11-29 19:21:22.875921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00001500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.875947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.876020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.876034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.979 
[2024-11-29 19:21:22.876095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.876108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.979 [2024-11-29 19:21:22.876164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.979 [2024-11-29 19:21:22.876177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.936070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00001500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.936095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.936153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.936167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.936224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.936237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.936294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.936307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.239 #53 NEW cov: 12523 ft: 15582 corp: 28/716b lim: 35 exec/s: 53 rss: 74Mb L: 32/35 MS: 2 InsertRepeatedBytes-ShuffleBytes- 00:08:03.239 [2024-11-29 19:21:22.976151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.976177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.976234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffff7ff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.976247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.976303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3df09420 cdw11:55710002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.976317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:22.976373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fffffdff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:22.976386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.239 #54 NEW cov: 12523 ft: 15594 corp: 29/747b lim: 35 exec/s: 54 rss: 74Mb L: 31/35 MS: 1 ChangeBit- 00:08:03.239 [2024-11-29 19:21:23.015970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00f70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.015997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.016058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:000a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.016072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.239 #55 NEW cov: 12523 ft: 15598 corp: 30/765b lim: 35 exec/s: 55 rss: 74Mb L: 18/35 MS: 1 ChangeBinInt- 00:08:03.239 [2024-11-29 19:21:23.076500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:d6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.076526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.076583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffffff7 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.076601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.076657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:203d0194 cdw11:f0550002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.076670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.076726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff6cfd cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.076739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.239 #56 NEW cov: 12523 ft: 15603 corp: 31/797b lim: 35 exec/s: 56 rss: 74Mb L: 32/35 MS: 1 InsertByte- 00:08:03.239 [2024-11-29 19:21:23.136661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00001500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.136687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.136744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.136757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.136814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.136826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.239 [2024-11-29 19:21:23.136882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000000fd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.239 [2024-11-29 19:21:23.136895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.499 #57 NEW cov: 12523 ft: 15608 corp: 32/829b lim: 35 exec/s: 57 rss: 74Mb L: 32/35 MS: 1 ChangeBinInt- 00:08:03.499 [2024-11-29 19:21:23.196820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00001500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.499 [2024-11-29 19:21:23.196846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.499 [2024-11-29 19:21:23.196902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.499 [2024-11-29 19:21:23.196915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.499 [2024-11-29 19:21:23.196973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.499 [2024-11-29 19:21:23.196986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.499 [2024-11-29 19:21:23.197041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:fd000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.499 [2024-11-29 19:21:23.197055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.499 #58 NEW cov: 12523 ft: 15633 corp: 33/862b lim: 35 exec/s: 58 rss: 74Mb L: 33/35 MS: 1 CopyPart- 00:08:03.499 [2024-11-29 19:21:23.257166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.257193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.257249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.257262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.257317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.257330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.257385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.257397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.257451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.257464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.500 #59 NEW cov: 12523 ft: 15642 corp: 34/897b lim: 35 exec/s: 59 rss: 74Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:03.500 [2024-11-29 19:21:23.317185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01940900 cdw11:203d0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.317211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.317284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6c005571 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.317299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.317354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.317367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.500 [2024-11-29 19:21:23.317422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-29 19:21:23.317436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.500 #60 NEW cov: 12523 ft: 15648 corp: 35/928b lim: 35 exec/s: 30 rss: 74Mb L: 31/35 MS: 1 PersAutoDict- DE: "\001\224 =\360Uql"- 00:08:03.500 #60 DONE cov: 12523 ft: 15648 corp: 35/928b lim: 35 exec/s: 30 rss: 74Mb 00:08:03.500 ###### Recommended dictionary. ###### 00:08:03.500 "\305\027\020\340\361\177\000\000" # Uses: 1 00:08:03.500 "\001\224 =\360Uql" # Uses: 1 00:08:03.500 ###### End of recommended dictionary. 
###### 00:08:03.500 Done 60 runs in 2 second(s) 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.760 19:21:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:03.760 [2024-11-29 19:21:23.485072] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:03.760 [2024-11-29 19:21:23.485142] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1748184 ] 00:08:04.019 [2024-11-29 19:21:23.669085] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.019 [2024-11-29 19:21:23.681811] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.019 [2024-11-29 19:21:23.734432] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.019 [2024-11-29 19:21:23.750809] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:04.019 INFO: Running with entropic power schedule (0xFF, 100). 
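To rerun one fuzzer outside Jenkins, the traced run.sh steps above reduce to three actions: rewrite the TCP service port in the JSON config, populate the LSAN suppression file (the two echo lines presumably feed it; the redirection itself is hidden by bash xtrace), and launch the harness. A minimal sketch under those assumptions, from an SPDK checkout with the LLVM fuzzers built (the -P output path from the trace is dropped; adjust paths to your tree):

    cd spdk
    mkdir -p ../corpus/llvm_nvmf_5
    sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' \
        test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_5.conf
    # assumption: this is what the two traced echo lines populate
    printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' \
        > /var/tmp/suppress_nvmf_fuzz
    LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 \
    test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' \
        -c /tmp/fuzz_json_5.conf -t 1 -D ../corpus/llvm_nvmf_5 -Z 5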
00:08:04.019 INFO: Seed: 1487297743 00:08:04.019 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:04.019 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:04.020 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:04.020 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.020 #2 INITED exec/s: 0 rss: 64Mb 00:08:04.020 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:04.020 This may also happen if the target rejected all inputs we tried so far 00:08:04.020 [2024-11-29 19:21:23.806237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-29 19:21:23.806268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.279 NEW_FUNC[1/716]: 0x461978 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:04.279 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:04.279 #20 NEW cov: 12303 ft: 12282 corp: 2/11b lim: 45 exec/s: 0 rss: 72Mb L: 10/10 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:04.279 [2024-11-29 19:21:24.117106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.279 [2024-11-29 19:21:24.117140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.279 NEW_FUNC[1/1]: 0x1981b98 in nvme_qpair_check_enabled /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:636 00:08:04.279 #21 NEW cov: 12420 ft: 12910 corp: 3/21b lim: 45 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:04.279 [2024-11-29 19:21:24.177136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.279 [2024-11-29 19:21:24.177162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.539 #27 NEW cov: 12426 ft: 13158 corp: 4/36b lim: 45 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 CopyPart- 00:08:04.539 [2024-11-29 19:21:24.217236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.539 [2024-11-29 19:21:24.217261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.539 #28 NEW cov: 12511 ft: 13540 corp: 5/46b lim: 45 exec/s: 0 rss: 72Mb L: 10/15 MS: 1 ChangeBit- 00:08:04.539 [2024-11-29 19:21:24.257530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.539 [2024-11-29 19:21:24.257555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.539 [2024-11-29 19:21:24.257616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
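The #NN NEW lines that follow are libFuzzer's standard progress output. A rough field-by-field gloss of the first one, per libFuzzer's conventions rather than anything SPDK-specific:

    # #20 NEW cov: 12303 ft: 12282 corp: 2/11b lim: 45 exec/s: 0 rss: 72Mb L: 10/10 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes-
    #   #20      input number that triggered the event; NEW = it raised
    #            coverage and was added to the corpus
    #   cov/ft   covered code edges / finer-grained coverage "features"
    #   corp     corpus now holds 2 inputs totalling 11 bytes
    #   lim      current input-length cap (grows toward -max_len)
    #   exec/s   executions per second (0 = under one second elapsed)
    #   rss      resident memory
    #   L        length of the new input / longest input in the corpus
    #   MS       mutation sequence that produced it, here three mutators
    #            chained: ShuffleBytes, CopyPart, InsertRepeatedBytes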
(01) qid:0 cid:5 nsid:0 cdw10:a97f3327 cdw11:44200004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.539 [2024-11-29 19:21:24.257631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.539 #29 NEW cov: 12511 ft: 14320 corp: 6/64b lim: 45 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 CMP- DE: "3'\251\177D \224\000"- 00:08:04.539 [2024-11-29 19:21:24.317499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.539 [2024-11-29 19:21:24.317524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.539 #30 NEW cov: 12511 ft: 14486 corp: 7/75b lim: 45 exec/s: 0 rss: 72Mb L: 11/18 MS: 1 InsertByte- 00:08:04.539 [2024-11-29 19:21:24.377654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff830aff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.539 [2024-11-29 19:21:24.377679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.539 #31 NEW cov: 12511 ft: 14554 corp: 8/87b lim: 45 exec/s: 0 rss: 73Mb L: 12/18 MS: 1 InsertByte- 00:08:04.539 [2024-11-29 19:21:24.437871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ff060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.539 [2024-11-29 19:21:24.437898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.799 #32 NEW cov: 12511 ft: 14612 corp: 9/98b lim: 45 exec/s: 0 rss: 73Mb L: 11/18 MS: 1 ShuffleBytes- 00:08:04.799 [2024-11-29 19:21:24.478125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff830aff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.478151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.799 [2024-11-29 19:21:24.478208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0aff3000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.478222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.799 #33 NEW cov: 12511 ft: 14654 corp: 10/124b lim: 45 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:04.799 [2024-11-29 19:21:24.538124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff060001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.538150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.799 #34 NEW cov: 12511 ft: 14700 corp: 11/135b lim: 45 exec/s: 0 rss: 73Mb L: 11/26 MS: 1 InsertByte- 00:08:04.799 [2024-11-29 19:21:24.578240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:27a90a33 cdw11:7f440001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.578267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.799 #36 
NEW cov: 12511 ft: 14772 corp: 12/145b lim: 45 exec/s: 0 rss: 73Mb L: 10/26 MS: 2 CopyPart-PersAutoDict- DE: "3'\251\177D \224\000"- 00:08:04.799 [2024-11-29 19:21:24.618528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d5d50ad5 cdw11:d5d50006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.618553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.799 [2024-11-29 19:21:24.618612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d50006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.618627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.799 #37 NEW cov: 12511 ft: 14790 corp: 13/163b lim: 45 exec/s: 0 rss: 73Mb L: 18/26 MS: 1 InsertRepeatedBytes- 00:08:04.799 [2024-11-29 19:21:24.658450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00ff0a03 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.799 [2024-11-29 19:21:24.658476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.799 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:04.799 #38 NEW cov: 12534 ft: 14837 corp: 14/180b lim: 45 exec/s: 0 rss: 73Mb L: 17/26 MS: 1 CMP- DE: "\003\000"- 00:08:05.059 [2024-11-29 19:21:24.718821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff830aff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.718847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.060 [2024-11-29 19:21:24.718907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:33273000 cdw11:a97f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.718921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.060 #39 NEW cov: 12534 ft: 14887 corp: 15/200b lim: 45 exec/s: 0 rss: 73Mb L: 20/26 MS: 1 PersAutoDict- DE: "3'\251\177D \224\000"- 00:08:05.060 [2024-11-29 19:21:24.759093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.759119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.060 [2024-11-29 19:21:24.759178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:a97f3327 cdw11:44200003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.759192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.060 [2024-11-29 19:21:24.759249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3327ffff cdw11:a97f0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.759263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.060 #40 NEW cov: 12534 ft: 15220 corp: 16/232b lim: 45 exec/s: 40 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:08:05.060 [2024-11-29 19:21:24.819081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff7f0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.819107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.060 [2024-11-29 19:21:24.819167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:7f4427a9 cdw11:20940004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.819181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.060 #41 NEW cov: 12534 ft: 15269 corp: 17/250b lim: 45 exec/s: 41 rss: 73Mb L: 18/32 MS: 1 CopyPart- 00:08:05.060 [2024-11-29 19:21:24.859194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff830aff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.859219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.060 [2024-11-29 19:21:24.859279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0aff3000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.859293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.060 #42 NEW cov: 12534 ft: 15287 corp: 18/276b lim: 45 exec/s: 42 rss: 73Mb L: 26/32 MS: 1 ChangeASCIIInt- 00:08:05.060 [2024-11-29 19:21:24.919170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.919195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.060 #43 NEW cov: 12534 ft: 15331 corp: 19/290b lim: 45 exec/s: 43 rss: 73Mb L: 14/32 MS: 1 CopyPart- 00:08:05.060 [2024-11-29 19:21:24.959466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0add cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.959492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.060 [2024-11-29 19:21:24.959552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:a97f3327 cdw11:44200004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.060 [2024-11-29 19:21:24.959565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.319 #44 NEW cov: 12534 ft: 15360 corp: 20/308b lim: 45 exec/s: 44 rss: 73Mb L: 18/32 MS: 1 ChangeByte- 00:08:05.319 [2024-11-29 19:21:24.999953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff000aff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:24.999982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.000056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.000070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.000128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.000142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.000197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ff7f0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.000211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.319 #45 NEW cov: 12534 ft: 15703 corp: 21/346b lim: 45 exec/s: 45 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:05.319 [2024-11-29 19:21:25.039497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.039523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.319 #46 NEW cov: 12534 ft: 15708 corp: 22/356b lim: 45 exec/s: 46 rss: 73Mb L: 10/38 MS: 1 ChangeBinInt- 00:08:05.319 [2024-11-29 19:21:25.079629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff890aff cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.079654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.319 #47 NEW cov: 12534 ft: 15728 corp: 23/366b lim: 45 exec/s: 47 rss: 73Mb L: 10/38 MS: 1 ChangeByte- 00:08:05.319 [2024-11-29 19:21:25.120096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d5d50ad5 cdw11:d5d50006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.120121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.120196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:83ffffff cdw11:ff060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.120211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.120269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d5d5000a cdw11:d5d50006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.120283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.319 #48 NEW cov: 12534 ft: 15756 corp: 24/395b lim: 45 exec/s: 48 rss: 73Mb L: 29/38 MS: 1 CrossOver- 00:08:05.319 [2024-11-29 19:21:25.180264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:4 nsid:0 cdw10:ff830aff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.180289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.180364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0a0a3000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.180378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.319 [2024-11-29 19:21:25.180435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.180452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.319 #49 NEW cov: 12534 ft: 15787 corp: 25/428b lim: 45 exec/s: 49 rss: 73Mb L: 33/38 MS: 1 CrossOver- 00:08:05.319 [2024-11-29 19:21:25.220041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00ff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.319 [2024-11-29 19:21:25.220067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.578 #50 NEW cov: 12534 ft: 15788 corp: 26/443b lim: 45 exec/s: 50 rss: 73Mb L: 15/38 MS: 1 ChangeByte- 00:08:05.578 [2024-11-29 19:21:25.260190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff00ff0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.260215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.578 #51 NEW cov: 12534 ft: 15802 corp: 27/458b lim: 45 exec/s: 51 rss: 73Mb L: 15/38 MS: 1 ShuffleBytes- 00:08:05.578 [2024-11-29 19:21:25.320716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d5d50ad5 cdw11:d5d50006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.320742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.578 [2024-11-29 19:21:25.320815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d50006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.320829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.578 [2024-11-29 19:21:25.320886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7f4427a9 cdw11:20940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.320899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.578 #52 NEW cov: 12534 ft: 15823 corp: 28/486b lim: 45 exec/s: 52 rss: 73Mb L: 28/38 MS: 1 CrossOver- 00:08:05.578 [2024-11-29 19:21:25.360463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:f500ff01 cdw11:00f90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.360488] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.578 #53 NEW cov: 12534 ft: 15862 corp: 29/497b lim: 45 exec/s: 53 rss: 73Mb L: 11/38 MS: 1 ChangeBinInt- 00:08:05.578 [2024-11-29 19:21:25.420971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00ff0a03 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.420996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.578 [2024-11-29 19:21:25.421056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0a03ffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.421069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.578 [2024-11-29 19:21:25.421127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.421140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.578 #54 NEW cov: 12534 ft: 15885 corp: 30/530b lim: 45 exec/s: 54 rss: 73Mb L: 33/38 MS: 1 CopyPart- 00:08:05.578 [2024-11-29 19:21:25.480994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff7f0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.481023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.578 [2024-11-29 19:21:25.481081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:7f4427a9 cdw11:20940004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.578 [2024-11-29 19:21:25.481095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.838 #55 NEW cov: 12534 ft: 15905 corp: 31/548b lim: 45 exec/s: 55 rss: 74Mb L: 18/38 MS: 1 ChangeByte- 00:08:05.838 [2024-11-29 19:21:25.540999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff010aff cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.838 [2024-11-29 19:21:25.541025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.838 #56 NEW cov: 12534 ft: 15943 corp: 32/562b lim: 45 exec/s: 56 rss: 74Mb L: 14/38 MS: 1 CMP- DE: "\001\000\001\000"- 00:08:05.838 [2024-11-29 19:21:25.601162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff010aff cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.838 [2024-11-29 19:21:25.601188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.838 #57 NEW cov: 12534 ft: 15960 corp: 33/576b lim: 45 exec/s: 57 rss: 74Mb L: 14/38 MS: 1 ChangeBinInt- 00:08:05.838 [2024-11-29 19:21:25.661311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.838 [2024-11-29 19:21:25.661336] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.838 #58 NEW cov: 12534 ft: 15962 corp: 34/590b lim: 45 exec/s: 58 rss: 74Mb L: 14/38 MS: 1 ShuffleBytes- 00:08:05.838 [2024-11-29 19:21:25.701377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:7fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.838 [2024-11-29 19:21:25.701402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.838 #59 NEW cov: 12534 ft: 15970 corp: 35/601b lim: 45 exec/s: 59 rss: 74Mb L: 11/38 MS: 1 CrossOver- 00:08:05.838 [2024-11-29 19:21:25.741739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00ff0aff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.838 [2024-11-29 19:21:25.741765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.838 [2024-11-29 19:21:25.741824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.838 [2024-11-29 19:21:25.741838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.097 #60 NEW cov: 12534 ft: 16026 corp: 36/625b lim: 45 exec/s: 60 rss: 74Mb L: 24/38 MS: 1 CopyPart- 00:08:06.097 [2024-11-29 19:21:25.781639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.097 [2024-11-29 19:21:25.781664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.097 #61 NEW cov: 12534 ft: 16066 corp: 37/635b lim: 45 exec/s: 30 rss: 74Mb L: 10/38 MS: 1 ChangeBinInt- 00:08:06.097 #61 DONE cov: 12534 ft: 16066 corp: 37/635b lim: 45 exec/s: 30 rss: 74Mb 00:08:06.097 ###### Recommended dictionary. ###### 00:08:06.097 "3'\251\177D \224\000" # Uses: 2 00:08:06.097 "\003\000" # Uses: 0 00:08:06.097 "\001\000\001\000" # Uses: 0 00:08:06.097 ###### End of recommended dictionary. 
###### 00:08:06.097 Done 61 runs in 2 second(s) 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:06.097 19:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:06.097 [2024-11-29 19:21:25.944112] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:06.098 [2024-11-29 19:21:25.944182] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1748485 ] 00:08:06.357 [2024-11-29 19:21:26.132159] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.357 [2024-11-29 19:21:26.144951] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.357 [2024-11-29 19:21:26.197523] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.357 [2024-11-29 19:21:26.213908] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:06.357 INFO: Running with entropic power schedule (0xFF, 100). 
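The recommended-dictionary block above is libFuzzer suggesting byte sequences worth mutating around. They can be fed back on later runs via -dict=; a hypothetical sketch (the octal escapes from the log are rewritten as \xNN hex, which is what libFuzzer's dictionary parser accepts, and it is assumed, unverified here, that the llvm_nvme_fuzz wrapper forwards the flag through to libFuzzer):

    # entries from the run-5 summary above; kw names are arbitrary
    cat > /tmp/llvm_nvmf_5.dict <<'EOF'
    kw1="3'\xa9\x7fD \x94\x00"
    kw2="\x03\x00"
    kw3="\x01\x00\x01\x00"
    EOF
    # then append -dict=/tmp/llvm_nvmf_5.dict to the llvm_nvme_fuzz
    # invocation shown earlier in this log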
00:08:06.357 INFO: Seed: 3950306784 00:08:06.357 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:06.357 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:06.357 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.357 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.357 #2 INITED exec/s: 0 rss: 65Mb 00:08:06.357 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.357 This may also happen if the target rejected all inputs we tried so far 00:08:06.616 [2024-11-29 19:21:26.290392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:06.616 [2024-11-29 19:21:26.290428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.616 [2024-11-29 19:21:26.290536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.616 [2024-11-29 19:21:26.290556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.616 [2024-11-29 19:21:26.290674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.616 [2024-11-29 19:21:26.290691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.875 NEW_FUNC[1/715]: 0x464188 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:06.875 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.875 #4 NEW cov: 12214 ft: 12215 corp: 2/8b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:06.875 [2024-11-29 19:21:26.641438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:06.875 [2024-11-29 19:21:26.641486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.875 [2024-11-29 19:21:26.641619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000032ff cdw11:00000000 00:08:06.875 [2024-11-29 19:21:26.641642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.875 [2024-11-29 19:21:26.641763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.875 [2024-11-29 19:21:26.641784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.875 #5 NEW cov: 12337 ft: 12843 corp: 3/15b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 ChangeByte- 00:08:06.875 [2024-11-29 19:21:26.711452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:06.875 [2024-11-29 19:21:26.711479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.875 
[2024-11-29 19:21:26.711602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000031ff cdw11:00000000 00:08:06.875 [2024-11-29 19:21:26.711618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.875 [2024-11-29 19:21:26.711740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.875 [2024-11-29 19:21:26.711756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.875 #11 NEW cov: 12343 ft: 13034 corp: 4/22b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 ChangeASCIIInt- 00:08:06.875 [2024-11-29 19:21:26.781671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:06.876 [2024-11-29 19:21:26.781699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.876 [2024-11-29 19:21:26.781816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.876 [2024-11-29 19:21:26.781834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.876 [2024-11-29 19:21:26.781945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.876 [2024-11-29 19:21:26.781960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.135 #12 NEW cov: 12428 ft: 13435 corp: 5/29b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 ChangeBit- 00:08:07.135 [2024-11-29 19:21:26.831814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.831845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:26.831966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002aff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.831982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:26.832097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.832114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.135 #13 NEW cov: 12428 ft: 13522 corp: 6/36b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 ChangeBinInt- 00:08:07.135 [2024-11-29 19:21:26.882146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000094 cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.882172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:26.882289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002045 cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.882306] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:26.882411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ef58 cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.882427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:26.882535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000f80 cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.882553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.135 #16 NEW cov: 12437 ft: 13826 corp: 7/45b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 CopyPart-CrossOver-CMP- DE: "\000\224 E\357X\017\200"- 00:08:07.135 [2024-11-29 19:21:26.931683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:26.931709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.135 #17 NEW cov: 12437 ft: 14102 corp: 8/48b lim: 10 exec/s: 0 rss: 72Mb L: 3/9 MS: 1 CrossOver- 00:08:07.135 [2024-11-29 19:21:27.002274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:27.002303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:27.002419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:27.002437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.135 [2024-11-29 19:21:27.002554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.135 [2024-11-29 19:21:27.002571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.395 #18 NEW cov: 12437 ft: 14178 corp: 9/55b lim: 10 exec/s: 0 rss: 73Mb L: 7/9 MS: 1 CrossOver- 00:08:07.395 [2024-11-29 19:21:27.072737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c802 cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.072767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.072888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.072909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.073025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.073042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.073160] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.073178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.395 #19 NEW cov: 12437 ft: 14215 corp: 10/63b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 1 InsertByte- 00:08:07.395 [2024-11-29 19:21:27.122237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000202 cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.122265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.395 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:07.395 #20 NEW cov: 12460 ft: 14286 corp: 11/66b lim: 10 exec/s: 0 rss: 73Mb L: 3/9 MS: 1 CopyPart- 00:08:07.395 [2024-11-29 19:21:27.193348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.193376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.193502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.193520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.193641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff02 cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.193658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.193775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.193791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.193906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff7f cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.193922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.395 #21 NEW cov: 12460 ft: 14371 corp: 12/76b lim: 10 exec/s: 0 rss: 73Mb L: 10/10 MS: 1 CrossOver- 00:08:07.395 [2024-11-29 19:21:27.263300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.263328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.263453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000031ff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.263469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.263583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff96 
cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.263603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.395 [2024-11-29 19:21:27.263723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.395 [2024-11-29 19:21:27.263745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.683 #22 NEW cov: 12460 ft: 14402 corp: 13/84b lim: 10 exec/s: 22 rss: 73Mb L: 8/10 MS: 1 InsertByte- 00:08:07.683 [2024-11-29 19:21:27.332840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.332868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.683 #23 NEW cov: 12460 ft: 14471 corp: 14/87b lim: 10 exec/s: 23 rss: 73Mb L: 3/10 MS: 1 ChangeByte- 00:08:07.683 [2024-11-29 19:21:27.382996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.383024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.683 #24 NEW cov: 12460 ft: 14495 corp: 15/90b lim: 10 exec/s: 24 rss: 73Mb L: 3/10 MS: 1 ShuffleBytes- 00:08:07.683 [2024-11-29 19:21:27.454114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.454141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.454260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.454276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.454386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.454402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.454528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.454545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.454680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff02 cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.454697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.683 #30 NEW cov: 12460 ft: 14509 corp: 16/100b lim: 10 exec/s: 30 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:07.683 [2024-11-29 19:21:27.503760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.503787] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.503894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002aff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.503911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.504028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f9ff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.504045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.683 #31 NEW cov: 12460 ft: 14543 corp: 17/107b lim: 10 exec/s: 31 rss: 73Mb L: 7/10 MS: 1 ChangeBinInt- 00:08:07.683 [2024-11-29 19:21:27.554026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.554053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.554177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.554195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.683 [2024-11-29 19:21:27.554315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.683 [2024-11-29 19:21:27.554332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.014 #32 NEW cov: 12460 ft: 14551 corp: 18/114b lim: 10 exec/s: 32 rss: 73Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:08.014 [2024-11-29 19:21:27.603749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.014 [2024-11-29 19:21:27.603778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.014 #33 NEW cov: 12460 ft: 14574 corp: 19/117b lim: 10 exec/s: 33 rss: 73Mb L: 3/10 MS: 1 ChangeByte- 00:08:08.015 [2024-11-29 19:21:27.654285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.654314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.654436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.654452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.654573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.654589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.015 #34 NEW cov: 12460 ft: 14597 corp: 20/124b lim: 10 exec/s: 34 rss: 73Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:08.015 [2024-11-29 
19:21:27.724822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002094 cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.724851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.724964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002045 cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.724979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.725092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ef58 cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.725106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.725230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000f80 cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.725248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.015 #35 NEW cov: 12460 ft: 14615 corp: 21/133b lim: 10 exec/s: 35 rss: 73Mb L: 9/10 MS: 1 ChangeBit- 00:08:08.015 [2024-11-29 19:21:27.794783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.794812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.794924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.794939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.795059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.795077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.015 #36 NEW cov: 12460 ft: 14621 corp: 22/140b lim: 10 exec/s: 36 rss: 73Mb L: 7/10 MS: 1 ChangeBit- 00:08:08.015 [2024-11-29 19:21:27.865231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.865259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.865374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000031ff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.865392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 19:21:27.865517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff40 cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.865534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.015 [2024-11-29 
19:21:27.865648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000096ff cdw11:00000000 00:08:08.015 [2024-11-29 19:21:27.865666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.274 #37 NEW cov: 12460 ft: 14623 corp: 23/149b lim: 10 exec/s: 37 rss: 74Mb L: 9/10 MS: 1 InsertByte- 00:08:08.274 [2024-11-29 19:21:27.935204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:27.935233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:27.935351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002aff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:27.935369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:27.935483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000027ff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:27.935501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.274 #38 NEW cov: 12460 ft: 14643 corp: 24/156b lim: 10 exec/s: 38 rss: 74Mb L: 7/10 MS: 1 ChangeByte- 00:08:08.274 [2024-11-29 19:21:28.005825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.005852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.005957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.005973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.006087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.006106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.006226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.006243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.006354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff02 cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.006375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.274 #39 NEW cov: 12460 ft: 14648 corp: 25/166b lim: 10 exec/s: 39 rss: 74Mb L: 10/10 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:08.274 [2024-11-29 19:21:28.075594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.075627] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.075737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000efff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.075753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.075864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.075880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.274 #40 NEW cov: 12460 ft: 14656 corp: 26/173b lim: 10 exec/s: 40 rss: 74Mb L: 7/10 MS: 1 ChangeBit- 00:08:08.274 [2024-11-29 19:21:28.145872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.145899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.146012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002aff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.146031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.274 [2024-11-29 19:21:28.146143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.274 [2024-11-29 19:21:28.146160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.274 #41 NEW cov: 12460 ft: 14674 corp: 27/180b lim: 10 exec/s: 41 rss: 74Mb L: 7/10 MS: 1 CopyPart- 00:08:08.533 [2024-11-29 19:21:28.195975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000002ff cdw11:00000000 00:08:08.534 [2024-11-29 19:21:28.196003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.534 [2024-11-29 19:21:28.196117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.534 [2024-11-29 19:21:28.196133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.534 [2024-11-29 19:21:28.196244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.534 [2024-11-29 19:21:28.196261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.534 #42 NEW cov: 12460 ft: 14688 corp: 28/187b lim: 10 exec/s: 42 rss: 74Mb L: 7/10 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:08.534 [2024-11-29 19:21:28.246132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffef cdw11:00000000 00:08:08.534 [2024-11-29 19:21:28.246158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.534 [2024-11-29 19:21:28.246276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000580f cdw11:00000000 00:08:08.534 [2024-11-29 19:21:28.246292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.534 [2024-11-29 19:21:28.246410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000080ff cdw11:00000000 00:08:08.534 [2024-11-29 19:21:28.246427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.534 #43 NEW cov: 12460 ft: 14693 corp: 29/194b lim: 10 exec/s: 21 rss: 74Mb L: 7/10 MS: 1 CrossOver- 00:08:08.534 #43 DONE cov: 12460 ft: 14693 corp: 29/194b lim: 10 exec/s: 21 rss: 74Mb 00:08:08.534 ###### Recommended dictionary. ###### 00:08:08.534 "\000\224 E\357X\017\200" # Uses: 0 00:08:08.534 "\000\000\000\000" # Uses: 1 00:08:08.534 ###### End of recommended dictionary. ###### 00:08:08.534 Done 43 runs in 2 second(s) 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.534 19:21:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:08.534 [2024-11-29 19:21:28.413157] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:08.534 [2024-11-29 19:21:28.413235] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1749022 ] 00:08:08.793 [2024-11-29 19:21:28.594802] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.793 [2024-11-29 19:21:28.607126] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.793 [2024-11-29 19:21:28.659555] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.793 [2024-11-29 19:21:28.675896] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:08.793 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.793 INFO: Seed: 2118340677 00:08:09.051 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:09.051 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:09.051 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:09.051 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.051 #2 INITED exec/s: 0 rss: 65Mb 00:08:09.051 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.051 This may also happen if the target rejected all inputs we tried so far 00:08:09.051 [2024-11-29 19:21:28.731244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:09.051 [2024-11-29 19:21:28.731273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.311 NEW_FUNC[1/715]: 0x464b88 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:09.311 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.311 #5 NEW cov: 12224 ft: 12222 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 3 CopyPart-ShuffleBytes-CopyPart- 00:08:09.311 [2024-11-29 19:21:29.042049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df3a cdw11:00000000 00:08:09.311 [2024-11-29 19:21:29.042089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.311 #8 NEW cov: 12337 ft: 12947 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 3 ChangeByte-ChangeBit-InsertByte- 00:08:09.311 [2024-11-29 19:21:29.081993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:08:09.311 [2024-11-29 19:21:29.082020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.311 #9 NEW cov: 12343 ft: 13159 corp: 4/7b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ChangeBit- 00:08:09.311 [2024-11-29 19:21:29.142489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:09.311 [2024-11-29 
19:21:29.142515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.311 [2024-11-29 19:21:29.142567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.311 [2024-11-29 19:21:29.142580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.311 [2024-11-29 19:21:29.142631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.311 [2024-11-29 19:21:29.142644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.311 [2024-11-29 19:21:29.142693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.311 [2024-11-29 19:21:29.142706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.311 #12 NEW cov: 12428 ft: 13678 corp: 5/16b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 EraseBytes-ChangeBit-CMP- DE: "\000\000\000\000\000\000\000\020"- 00:08:09.311 [2024-11-29 19:21:29.202315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 00:08:09.311 [2024-11-29 19:21:29.202342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.570 #13 NEW cov: 12428 ft: 13798 corp: 6/18b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 1 CMP- DE: "\000\016"- 00:08:09.570 [2024-11-29 19:21:29.242529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df5e cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.242556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.242606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.242623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.570 #14 NEW cov: 12428 ft: 14079 corp: 7/23b lim: 10 exec/s: 0 rss: 73Mb L: 5/9 MS: 1 InsertRepeatedBytes- 00:08:09.570 [2024-11-29 19:21:29.302949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df5e cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.302976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.303025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.303039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.303090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.303104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 
19:21:29.303153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000005e cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.303167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.570 #15 NEW cov: 12428 ft: 14129 corp: 8/32b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:09.570 [2024-11-29 19:21:29.363140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.363167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.363217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.363230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.363280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.363294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.363342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000005e cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.363355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.570 #16 NEW cov: 12428 ft: 14142 corp: 9/41b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CopyPart- 00:08:09.570 [2024-11-29 19:21:29.423262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.423289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.423339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.423353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.423402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.423416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.570 [2024-11-29 19:21:29.423464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.570 [2024-11-29 19:21:29.423478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.570 #17 NEW cov: 12428 ft: 14175 corp: 10/49b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 1 CrossOver- 00:08:09.829 [2024-11-29 19:21:29.483417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:09.829 [2024-11-29 19:21:29.483444] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.829 [2024-11-29 19:21:29.483495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:09.829 [2024-11-29 19:21:29.483508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.829 [2024-11-29 19:21:29.483558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.829 [2024-11-29 19:21:29.483572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.483625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000014 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.483638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.830 #18 NEW cov: 12428 ft: 14268 corp: 11/58b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 InsertByte- 00:08:09.830 [2024-11-29 19:21:29.543590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.543621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.543670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.543683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.543732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.543745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.543795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000005a cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.543807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.830 #19 NEW cov: 12428 ft: 14282 corp: 12/67b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ChangeBit- 00:08:09.830 [2024-11-29 19:21:29.603405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.603431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.830 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:09.830 #20 NEW cov: 12451 ft: 14365 corp: 13/69b lim: 10 exec/s: 0 rss: 73Mb L: 2/9 MS: 1 ShuffleBytes- 00:08:09.830 [2024-11-29 19:21:29.663858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.663885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:09.830 [2024-11-29 19:21:29.663933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.663947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.663995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.664012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.830 #21 NEW cov: 12451 ft: 14537 corp: 14/76b lim: 10 exec/s: 0 rss: 73Mb L: 7/9 MS: 1 EraseBytes- 00:08:09.830 [2024-11-29 19:21:29.704060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df5e cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.704087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.704139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.704153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.704203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.704216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.830 [2024-11-29 19:21:29.704263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000058 cdw11:00000000 00:08:09.830 [2024-11-29 19:21:29.704277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.830 #22 NEW cov: 12451 ft: 14553 corp: 15/85b lim: 10 exec/s: 22 rss: 73Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:10.089 [2024-11-29 19:21:29.743933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009d9d cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.743959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.744009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009d2a cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.744021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.089 #24 NEW cov: 12451 ft: 14618 corp: 16/89b lim: 10 exec/s: 24 rss: 73Mb L: 4/9 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:10.089 [2024-11-29 19:21:29.784003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df3a cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.784029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.784078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 00:08:10.089 [2024-11-29 
19:21:29.784092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.089 #25 NEW cov: 12451 ft: 14635 corp: 17/93b lim: 10 exec/s: 25 rss: 73Mb L: 4/9 MS: 1 PersAutoDict- DE: "\000\016"- 00:08:10.089 [2024-11-29 19:21:29.824277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.824303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.824349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.824362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.824409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.824422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.089 #26 NEW cov: 12451 ft: 14650 corp: 18/99b lim: 10 exec/s: 26 rss: 73Mb L: 6/9 MS: 1 CMP- DE: "\377\377\000\000"- 00:08:10.089 [2024-11-29 19:21:29.864352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df5e cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.864378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.864430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.864444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.864493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.864507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.089 #27 NEW cov: 12451 ft: 14688 corp: 19/105b lim: 10 exec/s: 27 rss: 73Mb L: 6/9 MS: 1 CrossOver- 00:08:10.089 [2024-11-29 19:21:29.924607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.924633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.924684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.924697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.924747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.924761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.089 [2024-11-29 19:21:29.924809] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.924822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.089 #28 NEW cov: 12451 ft: 14700 corp: 20/114b lim: 10 exec/s: 28 rss: 73Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:10.089 [2024-11-29 19:21:29.964404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004c00 cdw11:00000000 00:08:10.089 [2024-11-29 19:21:29.964430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.089 #29 NEW cov: 12451 ft: 14719 corp: 21/117b lim: 10 exec/s: 29 rss: 73Mb L: 3/9 MS: 1 InsertByte- 00:08:10.348 [2024-11-29 19:21:30.005247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df3a cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.005281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.005341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007f7f cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.005359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.005420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007f7f cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.005440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.005507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.005534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.349 #30 NEW cov: 12451 ft: 14748 corp: 22/125b lim: 10 exec/s: 30 rss: 74Mb L: 8/9 MS: 1 InsertRepeatedBytes- 00:08:10.349 [2024-11-29 19:21:30.065104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.065131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.065181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.065195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.065244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fd00 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.065258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.065306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000014 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.065319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.349 #31 NEW cov: 12451 ft: 14777 corp: 23/134b lim: 10 exec/s: 31 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:10.349 [2024-11-29 19:21:30.124942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000e0b cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.124971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.349 #32 NEW cov: 12451 ft: 14787 corp: 24/136b lim: 10 exec/s: 32 rss: 74Mb L: 2/9 MS: 1 ChangeBit- 00:08:10.349 [2024-11-29 19:21:30.165041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000e04 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.165067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.349 #33 NEW cov: 12451 ft: 14847 corp: 25/138b lim: 10 exec/s: 33 rss: 74Mb L: 2/9 MS: 1 ChangeBit- 00:08:10.349 [2024-11-29 19:21:30.225538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df00 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.225564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.225617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.225631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.225681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.225695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.349 [2024-11-29 19:21:30.225745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:10.349 [2024-11-29 19:21:30.225758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.349 #34 NEW cov: 12451 ft: 14870 corp: 26/147b lim: 10 exec/s: 34 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:10.621 [2024-11-29 19:21:30.265534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:08:10.621 [2024-11-29 19:21:30.265560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.621 [2024-11-29 19:21:30.265615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.621 [2024-11-29 19:21:30.265629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.621 [2024-11-29 19:21:30.265683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0b cdw11:00000000 00:08:10.621 [2024-11-29 19:21:30.265697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.621 #35 NEW cov: 12451 ft: 14876 corp: 
27/153b lim: 10 exec/s: 35 rss: 74Mb L: 6/9 MS: 1 CrossOver- 00:08:10.621 [2024-11-29 19:21:30.305527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000df5e cdw11:00000000 00:08:10.621 [2024-11-29 19:21:30.305552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.621 [2024-11-29 19:21:30.305605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:10.621 [2024-11-29 19:21:30.305619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.621 #36 NEW cov: 12451 ft: 14891 corp: 28/158b lim: 10 exec/s: 36 rss: 74Mb L: 5/9 MS: 1 ChangeBit- 00:08:10.622 [2024-11-29 19:21:30.345921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.345947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.346001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.346015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.346065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.346079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.346128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000005e cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.346141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.622 #37 NEW cov: 12451 ft: 14895 corp: 29/167b lim: 10 exec/s: 37 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:10.622 [2024-11-29 19:21:30.386130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.386156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.386208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.386221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.386286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007ffd cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.386300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.386350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.386363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.386412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00001400 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.386425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.622 #38 NEW cov: 12451 ft: 14939 corp: 30/177b lim: 10 exec/s: 38 rss: 74Mb L: 10/10 MS: 1 CrossOver- 00:08:10.622 [2024-11-29 19:21:30.446154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000479c cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.446180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.446231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008f87 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.446245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.446296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004220 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.446310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.446359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00009400 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.446373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.622 #39 NEW cov: 12451 ft: 14983 corp: 31/185b lim: 10 exec/s: 39 rss: 74Mb L: 8/10 MS: 1 CMP- DE: "G\234\217\207B \224\000"- 00:08:10.622 [2024-11-29 19:21:30.506369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.506395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.506448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.506462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.506513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.622 [2024-11-29 19:21:30.506527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.622 [2024-11-29 19:21:30.506580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000010 cdw11:00000000 00:08:10.623 [2024-11-29 19:21:30.506593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.888 #40 NEW cov: 12451 ft: 14988 corp: 32/194b lim: 10 exec/s: 40 rss: 74Mb L: 9/10 MS: 1 ChangeBit- 00:08:10.888 [2024-11-29 19:21:30.546604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00001a00 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.546630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.546678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.546691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.546741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.546754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.546804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000010 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.546817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.546865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00007a10 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.546881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.888 #41 NEW cov: 12451 ft: 15000 corp: 33/204b lim: 10 exec/s: 41 rss: 74Mb L: 10/10 MS: 1 InsertByte- 00:08:10.888 [2024-11-29 19:21:30.606606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.606642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.606694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.606707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.606758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005eff cdw11:00000000 00:08:10.888 [2024-11-29 19:21:30.606772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.888 [2024-11-29 19:21:30.606821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000005e cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.606834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.889 #42 NEW cov: 12451 ft: 15010 corp: 34/213b lim: 10 exec/s: 42 rss: 74Mb L: 9/10 MS: 1 ChangeByte- 00:08:10.889 [2024-11-29 19:21:30.646751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000900 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.646778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.646830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.646844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.646895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005e00 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.646909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.646958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.646971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.889 #43 NEW cov: 12451 ft: 15017 corp: 35/222b lim: 10 exec/s: 43 rss: 74Mb L: 9/10 MS: 1 ChangeBinInt- 00:08:10.889 [2024-11-29 19:21:30.707035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e6f8 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.707062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.707115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.707129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.707181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.707195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.707248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000010 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.707261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.889 [2024-11-29 19:21:30.707316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00007a10 cdw11:00000000 00:08:10.889 [2024-11-29 19:21:30.707330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.889 #44 NEW cov: 12451 ft: 15024 corp: 36/232b lim: 10 exec/s: 22 rss: 74Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:10.889 #44 DONE cov: 12451 ft: 15024 corp: 36/232b lim: 10 exec/s: 22 rss: 74Mb 00:08:10.889 ###### Recommended dictionary. ###### 00:08:10.889 "\000\000\000\000\000\000\000\020" # Uses: 0 00:08:10.889 "\000\016" # Uses: 1 00:08:10.889 "\377\377\000\000" # Uses: 0 00:08:10.889 "G\234\217\207B \224\000" # Uses: 0 00:08:10.889 ###### End of recommended dictionary. 
###### 00:08:10.889 Done 44 runs in 2 second(s) 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:11.148 19:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:11.148 [2024-11-29 19:21:30.896228] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:11.148 [2024-11-29 19:21:30.896299] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1749374 ] 00:08:11.407 [2024-11-29 19:21:31.079701] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.407 [2024-11-29 19:21:31.092191] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.407 [2024-11-29 19:21:31.144732] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.407 [2024-11-29 19:21:31.161112] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:11.407 INFO: Running with entropic power schedule (0xFF, 100). 
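[Editor's aside on the block above: the run closes by printing a "Recommended dictionary" of byte sequences that produced new coverage. In standard libFuzzer tooling such entries can be saved in dictionary format and replayed on a later run with the -dict= option. The sketch below is illustrative only: the file name is hypothetical, the \xNN values are direct hex conversions of the octal escapes printed in the log, and the log does not show whether the llvm_nvme_fuzz wrapper forwards extra libFuzzer flags such as -dict=.]

# nvmf_7.dict -- hypothetical libFuzzer dictionary built from the
# "Recommended dictionary" block printed at the end of the run above.
# Octal escapes from the log are rewritten in the \xNN hex form that
# libFuzzer's -dict= parser expects.
kw1="\x00\x00\x00\x00\x00\x00\x00\x10"
kw2="\x00\x0e"
kw3="\xff\xff\x00\x00"
kw4="\x47\x9c\x8f\x87\x42\x20\x94\x00"
#
# Hypothetical replay (assumes the harness passes libFuzzer flags through):
#   ./llvm_nvme_fuzz ... -dict=nvmf_7.dict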
00:08:11.407 INFO: Seed: 308354359 00:08:11.407 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:11.407 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:11.407 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.407 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.407 [2024-11-29 19:21:31.226466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.407 [2024-11-29 19:21:31.226496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.407 #2 INITED cov: 12231 ft: 12232 corp: 1/1b exec/s: 0 rss: 71Mb 00:08:11.407 [2024-11-29 19:21:31.266502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.407 [2024-11-29 19:21:31.266529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.407 #3 NEW cov: 12363 ft: 12753 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeByte- 00:08:11.665 [2024-11-29 19:21:31.326689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.665 [2024-11-29 19:21:31.326716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.665 #4 NEW cov: 12369 ft: 13062 corp: 3/3b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 CopyPart- 00:08:11.665 [2024-11-29 19:21:31.367402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.665 [2024-11-29 19:21:31.367429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.665 [2024-11-29 19:21:31.367485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.665 [2024-11-29 19:21:31.367499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.666 [2024-11-29 19:21:31.367554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.666 [2024-11-29 19:21:31.367568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.666 [2024-11-29 19:21:31.367630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.666 [2024-11-29 19:21:31.367643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.666 [2024-11-29 19:21:31.367695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.666 
[2024-11-29 19:21:31.367709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.666 #5 NEW cov: 12454 ft: 14153 corp: 4/8b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:11.666 [2024-11-29 19:21:31.426962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.666 [2024-11-29 19:21:31.426989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.666 #6 NEW cov: 12454 ft: 14240 corp: 5/9b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ChangeBit- 00:08:11.666 [2024-11-29 19:21:31.487094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.666 [2024-11-29 19:21:31.487121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.666 #7 NEW cov: 12454 ft: 14305 corp: 6/10b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:11.666 [2024-11-29 19:21:31.547254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.666 [2024-11-29 19:21:31.547280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.666 #8 NEW cov: 12454 ft: 14356 corp: 7/11b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ChangeByte- 00:08:11.924 [2024-11-29 19:21:31.587378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.587405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.924 #9 NEW cov: 12454 ft: 14425 corp: 8/12b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ChangeBinInt- 00:08:11.924 [2024-11-29 19:21:31.647531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.647558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.924 #10 NEW cov: 12454 ft: 14436 corp: 9/13b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ChangeBit- 00:08:11.924 [2024-11-29 19:21:31.687669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.687696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.924 #11 NEW cov: 12454 ft: 14490 corp: 10/14b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:11.924 [2024-11-29 19:21:31.747854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.747881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:11.924 #12 NEW cov: 12454 ft: 14516 corp: 11/15b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:11.924 [2024-11-29 19:21:31.808632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.808658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.924 [2024-11-29 19:21:31.808715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.808730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.924 [2024-11-29 19:21:31.808785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.808798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.924 [2024-11-29 19:21:31.808855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.924 [2024-11-29 19:21:31.808868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.924 [2024-11-29 19:21:31.808921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.925 [2024-11-29 19:21:31.808934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.183 #13 NEW cov: 12454 ft: 14531 corp: 12/20b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeByte- 00:08:12.183 [2024-11-29 19:21:31.868179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:31.868206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.183 #14 NEW cov: 12454 ft: 14549 corp: 13/21b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 CrossOver- 00:08:12.183 [2024-11-29 19:21:31.908265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:31.908291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.183 #15 NEW cov: 12454 ft: 14556 corp: 14/22b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ChangeByte- 00:08:12.183 [2024-11-29 19:21:31.968622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:31.968649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.183 [2024-11-29 19:21:31.968704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:31.968718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.183 #16 NEW cov: 12454 ft: 14759 corp: 15/24b lim: 5 exec/s: 0 rss: 72Mb L: 2/5 MS: 1 CrossOver- 00:08:12.183 [2024-11-29 19:21:32.028623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:32.028649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.183 #17 NEW cov: 12454 ft: 14830 corp: 16/25b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:12.183 [2024-11-29 19:21:32.069344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:32.069370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.183 [2024-11-29 19:21:32.069426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:32.069441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.183 [2024-11-29 19:21:32.069496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:32.069510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.183 [2024-11-29 19:21:32.069566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:32.069579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.183 [2024-11-29 19:21:32.069639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.183 [2024-11-29 19:21:32.069653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.701 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:12.701 #18 NEW cov: 12477 ft: 14900 corp: 17/30b lim: 5 exec/s: 18 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:08:12.701 [2024-11-29 19:21:32.400165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.400239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.400345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.400382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.701 #19 NEW cov: 12477 ft: 15102 corp: 18/32b lim: 5 exec/s: 19 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:08:12.701 [2024-11-29 19:21:32.450185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.450213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.450272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.450286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.450345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.450360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.450418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.450431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.450488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.450501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.701 #20 NEW cov: 12477 ft: 15143 corp: 19/37b lim: 5 exec/s: 20 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:08:12.701 [2024-11-29 19:21:32.490424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.490452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.490509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.490523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.490581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.490595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.490655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.490671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.701 [2024-11-29 19:21:32.490726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.490740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.701 #21 NEW cov: 12477 ft: 15164 corp: 20/42b lim: 5 exec/s: 21 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:12.701 [2024-11-29 19:21:32.529915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.529943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.701 #22 NEW cov: 12477 ft: 15186 corp: 21/43b lim: 5 exec/s: 22 rss: 74Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:12.701 [2024-11-29 19:21:32.590057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.701 [2024-11-29 19:21:32.590085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.960 #23 NEW cov: 12477 ft: 15214 corp: 22/44b lim: 5 exec/s: 23 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:12.960 [2024-11-29 19:21:32.630803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.630830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.960 [2024-11-29 19:21:32.630888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.630902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.960 [2024-11-29 19:21:32.630961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.630975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.960 [2024-11-29 19:21:32.631032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.631045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.960 [2024-11-29 19:21:32.631102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.631116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.960 #24 NEW cov: 12477 ft: 15226 corp: 23/49b lim: 5 exec/s: 24 rss: 74Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:12.960 [2024-11-29 19:21:32.690328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.690356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.960 #25 NEW cov: 12477 ft: 15233 corp: 24/50b lim: 5 exec/s: 25 rss: 74Mb L: 1/5 MS: 1 CrossOver- 00:08:12.960 [2024-11-29 19:21:32.750477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.750508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.960 #26 NEW cov: 12477 ft: 15237 corp: 25/51b lim: 5 exec/s: 26 rss: 74Mb L: 1/5 MS: 1 ChangeBit- 00:08:12.960 [2024-11-29 19:21:32.790606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.790634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.960 #27 NEW cov: 12477 ft: 15257 corp: 26/52b lim: 5 exec/s: 27 rss: 74Mb L: 1/5 MS: 1 ChangeBit- 00:08:12.960 [2024-11-29 19:21:32.830709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.960 [2024-11-29 19:21:32.830735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.960 #28 NEW cov: 12477 ft: 15295 corp: 27/53b lim: 5 exec/s: 28 rss: 74Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:13.219 [2024-11-29 19:21:32.871007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.871034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:32.871092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.871106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.219 #29 NEW cov: 12477 ft: 15296 corp: 28/55b lim: 5 exec/s: 29 rss: 74Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:13.219 [2024-11-29 19:21:32.931611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.931637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:32.931696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.931710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:32.931766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.931780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:32.931834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.931849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:32.931905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.931919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.219 #30 NEW cov: 12477 ft: 15311 corp: 29/60b lim: 5 exec/s: 30 rss: 74Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:13.219 [2024-11-29 19:21:32.971107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:32.971133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.219 #31 NEW cov: 12477 ft: 15325 corp: 30/61b lim: 5 exec/s: 31 rss: 74Mb L: 1/5 MS: 1 ChangeBit- 00:08:13.219 [2024-11-29 19:21:33.011254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:33.011280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.219 #32 NEW cov: 12477 ft: 15337 corp: 31/62b lim: 5 exec/s: 32 rss: 74Mb L: 1/5 MS: 1 CopyPart- 00:08:13.219 [2024-11-29 19:21:33.071875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:33.071902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:33.071958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:33.071972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.219 [2024-11-29 19:21:33.072028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:33.072042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:13.219 [2024-11-29 19:21:33.072100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.219 [2024-11-29 19:21:33.072114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.219 #33 NEW cov: 12477 ft: 15354 corp: 32/66b lim: 5 exec/s: 33 rss: 74Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:13.479 [2024-11-29 19:21:33.131685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.479 [2024-11-29 19:21:33.131712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.479 [2024-11-29 19:21:33.131770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.479 [2024-11-29 19:21:33.131785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.479 #34 NEW cov: 12477 ft: 15391 corp: 33/68b lim: 5 exec/s: 34 rss: 74Mb L: 2/5 MS: 1 InsertByte- 00:08:13.479 [2024-11-29 19:21:33.171653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.479 [2024-11-29 19:21:33.171679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.479 #35 NEW cov: 12477 ft: 15397 corp: 34/69b lim: 5 exec/s: 35 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:13.479 [2024-11-29 19:21:33.211964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.479 [2024-11-29 19:21:33.211990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.479 [2024-11-29 19:21:33.212046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.479 [2024-11-29 19:21:33.212060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.479 #36 NEW cov: 12477 ft: 15426 corp: 35/71b lim: 5 exec/s: 18 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:08:13.479 #36 DONE cov: 12477 ft: 15426 corp: 35/71b lim: 5 exec/s: 18 rss: 74Mb 00:08:13.479 Done 36 runs in 2 second(s) 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # 
local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.479 19:21:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:13.739 [2024-11-29 19:21:33.401133] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:13.739 [2024-11-29 19:21:33.401204] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1749842 ] 00:08:13.739 [2024-11-29 19:21:33.583506] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.739 [2024-11-29 19:21:33.595712] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.998 [2024-11-29 19:21:33.648106] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.999 [2024-11-29 19:21:33.664458] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:13.999 INFO: Running with entropic power schedule (0xFF, 100). 
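[Editor's aside on the trace above: nvmf/run.sh provisions each fuzzer instance before handing control to libFuzzer. Condensed into a standalone sketch (with $SPDK_DIR as a placeholder for the long workspace path, and the redirections of the sed and echo steps inferred, since the xtrace output does not show them), the per-run pattern is: derive a dedicated TCP port from the fuzzer index, rewrite the target JSON config to listen on that port, point LeakSanitizer at a suppression file, and launch the harness against a per-run corpus directory.]

#!/usr/bin/env bash
# Sketch of the per-fuzzer setup visible in the trace above; $SPDK_DIR is a
# placeholder for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk.
fuzzer_type=9; timen=1; core=0x1
port=44$(printf %02d "$fuzzer_type")              # -> 4409 for fuzzer 9
corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type"
nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
suppress_file=/var/tmp/suppress_nvmf_fuzz
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

mkdir -p "$corpus_dir"
# Rebase the listener from the stock trsvcid 4420 onto the per-run port
# (the redirection into $nvmf_cfg is inferred; the trace shows only the sed).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
# Suppress the two known leaks for the duration of the run (destination
# file likewise inferred from the suppress_file variable set earlier).
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > "$suppress_file"

LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
  -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" \
  -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"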
00:08:13.999 INFO: Seed: 2811353453 00:08:13.999 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:13.999 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:13.999 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.999 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.999 [2024-11-29 19:21:33.709765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.999 [2024-11-29 19:21:33.709794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.999 #2 INITED cov: 12251 ft: 12238 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:13.999 [2024-11-29 19:21:33.749791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.999 [2024-11-29 19:21:33.749818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.999 #3 NEW cov: 12364 ft: 12671 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeByte- 00:08:13.999 [2024-11-29 19:21:33.810109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.999 [2024-11-29 19:21:33.810137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.999 [2024-11-29 19:21:33.810191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.999 [2024-11-29 19:21:33.810205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.999 #4 NEW cov: 12370 ft: 13682 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CopyPart- 00:08:13.999 [2024-11-29 19:21:33.850225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.999 [2024-11-29 19:21:33.850253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.999 [2024-11-29 19:21:33.850309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.999 [2024-11-29 19:21:33.850323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.999 #5 NEW cov: 12455 ft: 13956 corp: 4/6b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeBit- 00:08:14.259 [2024-11-29 19:21:33.910368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:33.910397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:33.910452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:33.910466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.259 #6 NEW cov: 12455 ft: 14124 corp: 5/8b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeBit- 00:08:14.259 [2024-11-29 19:21:33.970562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:33.970589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:33.970649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:33.970664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.259 #7 NEW cov: 12455 ft: 14288 corp: 6/10b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CopyPart- 00:08:14.259 [2024-11-29 19:21:34.010683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.010710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:34.010766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.010784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.259 #8 NEW cov: 12455 ft: 14381 corp: 7/12b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeBinInt- 00:08:14.259 [2024-11-29 19:21:34.071128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.071155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:34.071212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.071227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:34.071279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.071293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:34.071346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.071360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.259 #9 NEW cov: 12455 ft: 14752 corp: 8/16b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CrossOver- 00:08:14.259 [2024-11-29 19:21:34.130967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.130995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.259 [2024-11-29 19:21:34.131051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.259 [2024-11-29 19:21:34.131066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.259 #10 NEW cov: 12455 ft: 14809 corp: 9/18b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:08:14.519 [2024-11-29 19:21:34.170964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.170991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.519 #11 NEW cov: 12455 ft: 14859 corp: 10/19b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 EraseBytes- 00:08:14.519 [2024-11-29 19:21:34.211047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.211075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.519 #12 NEW cov: 12455 ft: 14881 corp: 11/20b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 EraseBytes- 00:08:14.519 [2024-11-29 19:21:34.271341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.271367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.519 [2024-11-29 19:21:34.271425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.271442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.519 #13 NEW cov: 12455 ft: 14894 corp: 12/22b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:14.519 [2024-11-29 19:21:34.311613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.311641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.519 [2024-11-29 19:21:34.311698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.311711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.519 [2024-11-29 19:21:34.311765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.311779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.519 #14 NEW cov: 12455 ft: 15088 corp: 13/25b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 CopyPart- 00:08:14.519 [2024-11-29 19:21:34.351583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.351616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.519 [2024-11-29 19:21:34.351673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.351687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.519 #15 NEW cov: 12455 ft: 15122 corp: 14/27b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 EraseBytes- 00:08:14.519 [2024-11-29 19:21:34.411669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.519 [2024-11-29 19:21:34.411696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.779 #16 NEW cov: 12455 ft: 15136 corp: 15/28b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 EraseBytes- 00:08:14.779 [2024-11-29 19:21:34.451863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.451890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.779 [2024-11-29 19:21:34.451947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.451961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.779 #17 NEW cov: 12455 ft: 15177 corp: 16/30b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 ChangeByte- 00:08:14.779 [2024-11-29 19:21:34.491977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.492003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.779 [2024-11-29 19:21:34.492058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.492073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.779 #18 NEW cov: 12455 ft: 15185 corp: 17/32b lim: 5 
exec/s: 0 rss: 72Mb L: 2/4 MS: 1 CopyPart- 00:08:14.779 [2024-11-29 19:21:34.532383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.532410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.779 [2024-11-29 19:21:34.532467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.532481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.779 [2024-11-29 19:21:34.532535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.532549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.779 [2024-11-29 19:21:34.532605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.532619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.779 #19 NEW cov: 12455 ft: 15255 corp: 18/36b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 CrossOver- 00:08:14.779 [2024-11-29 19:21:34.592263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.592290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.779 [2024-11-29 19:21:34.592347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.779 [2024-11-29 19:21:34.592361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.039 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:15.039 #20 NEW cov: 12478 ft: 15287 corp: 19/38b lim: 5 exec/s: 20 rss: 73Mb L: 2/4 MS: 1 CrossOver- 00:08:15.039 [2024-11-29 19:21:34.882967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.039 [2024-11-29 19:21:34.882999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.039 #21 NEW cov: 12478 ft: 15308 corp: 20/39b lim: 5 exec/s: 21 rss: 73Mb L: 1/4 MS: 1 EraseBytes- 00:08:15.039 [2024-11-29 19:21:34.943465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.039 [2024-11-29 19:21:34.943493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.039 [2024-11-29 
19:21:34.943548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.039 [2024-11-29 19:21:34.943562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.039 [2024-11-29 19:21:34.943620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.039 [2024-11-29 19:21:34.943635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.039 [2024-11-29 19:21:34.943691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.039 [2024-11-29 19:21:34.943705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.299 #22 NEW cov: 12478 ft: 15312 corp: 21/43b lim: 5 exec/s: 22 rss: 73Mb L: 4/4 MS: 1 CrossOver- 00:08:15.299 [2024-11-29 19:21:34.983552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:34.983580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.299 [2024-11-29 19:21:34.983638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:34.983653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.299 [2024-11-29 19:21:34.983706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:34.983719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.299 [2024-11-29 19:21:34.983771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:34.983785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.299 #23 NEW cov: 12478 ft: 15328 corp: 22/47b lim: 5 exec/s: 23 rss: 73Mb L: 4/4 MS: 1 CrossOver- 00:08:15.299 [2024-11-29 19:21:35.043286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:35.043314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.299 #24 NEW cov: 12478 ft: 15335 corp: 23/48b lim: 5 exec/s: 24 rss: 73Mb L: 1/4 MS: 1 CopyPart- 00:08:15.299 [2024-11-29 19:21:35.103428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 
[2024-11-29 19:21:35.103455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.299 #25 NEW cov: 12478 ft: 15376 corp: 24/49b lim: 5 exec/s: 25 rss: 73Mb L: 1/4 MS: 1 EraseBytes- 00:08:15.299 [2024-11-29 19:21:35.143553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:35.143579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.299 #26 NEW cov: 12478 ft: 15418 corp: 25/50b lim: 5 exec/s: 26 rss: 73Mb L: 1/4 MS: 1 ChangeBinInt- 00:08:15.299 [2024-11-29 19:21:35.203927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:35.203954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.299 [2024-11-29 19:21:35.204010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-29 19:21:35.204025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.557 #27 NEW cov: 12478 ft: 15422 corp: 26/52b lim: 5 exec/s: 27 rss: 74Mb L: 2/4 MS: 1 CopyPart- 00:08:15.557 [2024-11-29 19:21:35.264046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.557 [2024-11-29 19:21:35.264073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.557 [2024-11-29 19:21:35.264128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.557 [2024-11-29 19:21:35.264142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.557 #28 NEW cov: 12478 ft: 15440 corp: 27/54b lim: 5 exec/s: 28 rss: 74Mb L: 2/4 MS: 1 ChangeBinInt- 00:08:15.557 [2024-11-29 19:21:35.324518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.557 [2024-11-29 19:21:35.324544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.557 [2024-11-29 19:21:35.324604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.557 [2024-11-29 19:21:35.324618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.558 [2024-11-29 19:21:35.324672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.324685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.558 [2024-11-29 19:21:35.324755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.324768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.558 #29 NEW cov: 12478 ft: 15467 corp: 28/58b lim: 5 exec/s: 29 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:08:15.558 [2024-11-29 19:21:35.384516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.384544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.558 [2024-11-29 19:21:35.384603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.384617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.558 [2024-11-29 19:21:35.384667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.384681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.558 #30 NEW cov: 12478 ft: 15468 corp: 29/61b lim: 5 exec/s: 30 rss: 74Mb L: 3/4 MS: 1 CrossOver- 00:08:15.558 [2024-11-29 19:21:35.444731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.444759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.558 [2024-11-29 19:21:35.444816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.444831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.558 [2024-11-29 19:21:35.444887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-29 19:21:35.444902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.816 #31 NEW cov: 12478 ft: 15474 corp: 30/64b lim: 5 exec/s: 31 rss: 74Mb L: 3/4 MS: 1 CrossOver- 00:08:15.816 [2024-11-29 19:21:35.505077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.816 [2024-11-29 19:21:35.505104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.816 [2024-11-29 19:21:35.505157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.505171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.505225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.505239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.505290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.505303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.817 #32 NEW cov: 12478 ft: 15481 corp: 31/68b lim: 5 exec/s: 32 rss: 74Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:15.817 [2024-11-29 19:21:35.545131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.545159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.545212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.545226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.545279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.545293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.545345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.545358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.817 #33 NEW cov: 12478 ft: 15506 corp: 32/72b lim: 5 exec/s: 33 rss: 74Mb L: 4/4 MS: 1 CopyPart- 00:08:15.817 [2024-11-29 19:21:35.585155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.585183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.585239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.817 [2024-11-29 19:21:35.585257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.817 [2024-11-29 19:21:35.585308] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:15.817 [2024-11-29 19:21:35.585322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:15.817 #34 NEW cov: 12478 ft: 15529 corp: 33/75b lim: 5 exec/s: 34 rss: 74Mb L: 3/4 MS: 1 CopyPart-
00:08:15.817 [2024-11-29 19:21:35.645112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:15.817 [2024-11-29 19:21:35.645140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:15.817 [2024-11-29 19:21:35.645197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:15.817 [2024-11-29 19:21:35.645211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:15.817 [2024-11-29 19:21:35.685248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:15.817 [2024-11-29 19:21:35.685275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:15.817 [2024-11-29 19:21:35.685328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:15.817 [2024-11-29 19:21:35.685342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:15.817 #36 NEW cov: 12478 ft: 15556 corp: 34/77b lim: 5 exec/s: 18 rss: 74Mb L: 2/4 MS: 2 CopyPart-ChangeBinInt-
00:08:15.817 #36 DONE cov: 12478 ft: 15556 corp: 34/77b lim: 5 exec/s: 18 rss: 74Mb
00:08:15.817 Done 36 runs in 2 second(s)
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:16.077 19:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10
00:08:16.337 [2024-11-29 19:21:35.851740] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization...
00:08:16.337 [2024-11-29 19:21:35.851821] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1750319 ]
00:08:16.337 [2024-11-29 19:21:36.040251] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:16.337 [2024-11-29 19:21:36.052541] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:16.337 [2024-11-29 19:21:36.104981] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:16.337 [2024-11-29 19:21:36.121332] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:08:16.337 INFO: Running with entropic power schedule (0xFF, 100).
00:08:16.337 INFO: Seed: 971379695
00:08:16.337 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111),
00:08:16.337 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968),
00:08:16.337 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:08:16.337 INFO: A corpus is not provided, starting from an empty corpus
00:08:16.337 #2 INITED exec/s: 0 rss: 64Mb
00:08:16.337 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:16.337 This may also happen if the target rejected all inputs we tried so far 00:08:16.337 [2024-11-29 19:21:36.170357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.337 [2024-11-29 19:21:36.170388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.337 [2024-11-29 19:21:36.170447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.337 [2024-11-29 19:21:36.170461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.337 [2024-11-29 19:21:36.170514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.337 [2024-11-29 19:21:36.170527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.596 NEW_FUNC[1/716]: 0x466508 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:16.596 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.596 #6 NEW cov: 12274 ft: 12273 corp: 2/30b lim: 40 exec/s: 0 rss: 72Mb L: 29/29 MS: 4 ChangeByte-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:16.596 [2024-11-29 19:21:36.480994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:46096786 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.596 [2024-11-29 19:21:36.481035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.856 #8 NEW cov: 12387 ft: 13245 corp: 3/39b lim: 40 exec/s: 0 rss: 72Mb L: 9/29 MS: 2 ShuffleBytes-CMP- DE: "\001\224 F\011g\206\032"- 00:08:16.856 [2024-11-29 19:21:36.521478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.521505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.521566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.521580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.521643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80807878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.521657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.521716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:16.856 [2024-11-29 19:21:36.521729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.521784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:78808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.521798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.856 #9 NEW cov: 12393 ft: 13943 corp: 4/79b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:16.856 [2024-11-29 19:21:36.581282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:46096701 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.581404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.581426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:94204609 cdw11:67861a86 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.581437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.856 #10 NEW cov: 12487 ft: 14510 corp: 5/96b lim: 40 exec/s: 0 rss: 72Mb L: 17/40 MS: 1 CopyPart- 00:08:16.856 [2024-11-29 19:21:36.641397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:46096701 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.641424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.641486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:94204609 cdw11:67861a86 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.641500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.856 #11 NEW cov: 12487 ft: 14603 corp: 6/113b lim: 40 exec/s: 0 rss: 72Mb L: 17/40 MS: 1 ChangeByte- 00:08:16.856 [2024-11-29 19:21:36.701452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a010a01 cdw11:94204609 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.701478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.856 #13 NEW cov: 12487 ft: 14704 corp: 7/122b lim: 40 exec/s: 0 rss: 72Mb L: 9/40 MS: 2 EraseBytes-CrossOver- 00:08:16.856 [2024-11-29 19:21:36.741804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.741830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.741890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808078 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.741916] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.856 [2024-11-29 19:21:36.741974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78788080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.856 [2024-11-29 19:21:36.741988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.116 #14 NEW cov: 12487 ft: 14810 corp: 8/151b lim: 40 exec/s: 0 rss: 72Mb L: 29/40 MS: 1 EraseBytes- 00:08:17.116 [2024-11-29 19:21:36.802107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.802134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.802191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ecececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.802205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.802261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ececec46 cdw11:09670194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.802275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.802331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:20460967 cdw11:861a8623 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.802345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.116 #15 NEW cov: 12487 ft: 14912 corp: 9/183b lim: 40 exec/s: 0 rss: 72Mb L: 32/40 MS: 1 InsertRepeatedBytes- 00:08:17.116 [2024-11-29 19:21:36.862381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.862410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.862472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.862487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.862547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80788080 cdw11:80788080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.862561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.862619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:78787878 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 
19:21:36.862633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.862691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:78808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.862704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.116 #16 NEW cov: 12487 ft: 14992 corp: 10/223b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:17.116 [2024-11-29 19:21:36.902376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.902406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.902464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ecececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.902478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.902538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ececec46 cdw11:09670194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.902551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.116 [2024-11-29 19:21:36.902611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2046ec67 cdw11:861a8623 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.902624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.116 #17 NEW cov: 12487 ft: 15023 corp: 11/255b lim: 40 exec/s: 0 rss: 73Mb L: 32/40 MS: 1 CopyPart- 00:08:17.116 [2024-11-29 19:21:36.962222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00942046 cdw11:50120da6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.116 [2024-11-29 19:21:36.962249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.116 #18 NEW cov: 12487 ft: 15052 corp: 12/264b lim: 40 exec/s: 0 rss: 73Mb L: 9/40 MS: 1 CMP- DE: "\000\224 FP\022\015\246"- 00:08:17.376 [2024-11-29 19:21:37.022695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.376 [2024-11-29 19:21:37.022723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.376 [2024-11-29 19:21:37.022784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.376 [2024-11-29 19:21:37.022798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.376 [2024-11-29 19:21:37.022857] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.376 [2024-11-29 19:21:37.022871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.376 #19 NEW cov: 12487 ft: 15072 corp: 13/293b lim: 40 exec/s: 0 rss: 73Mb L: 29/40 MS: 1 ChangeByte- 00:08:17.376 [2024-11-29 19:21:37.063036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.376 [2024-11-29 19:21:37.063063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.376 [2024-11-29 19:21:37.063124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.376 [2024-11-29 19:21:37.063137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.376 [2024-11-29 19:21:37.063197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.063211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.063272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8080800a cdw11:80800194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.063285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.063343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:20ececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.063357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.377 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:17.377 #20 NEW cov: 12510 ft: 15118 corp: 14/333b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 CrossOver- 00:08:17.377 [2024-11-29 19:21:37.123057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.123084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.123142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ecececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.123157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.123213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ececec46 cdw11:09678194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 
19:21:37.123227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.123281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:20460967 cdw11:861a8623 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.123295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.377 #21 NEW cov: 12510 ft: 15179 corp: 15/365b lim: 40 exec/s: 0 rss: 73Mb L: 32/40 MS: 1 ChangeBit- 00:08:17.377 [2024-11-29 19:21:37.163336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.163363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.163425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.163439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.163497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:808f8080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.163511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.163565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8080800a cdw11:80800194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.163578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.163637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:20ececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.163651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.377 #22 NEW cov: 12510 ft: 15226 corp: 16/405b lim: 40 exec/s: 22 rss: 73Mb L: 40/40 MS: 1 ChangeByte- 00:08:17.377 [2024-11-29 19:21:37.223173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.223200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.223260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8c808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.223273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.223330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.223344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.377 #23 NEW cov: 12510 ft: 15301 corp: 17/434b lim: 40 exec/s: 23 rss: 73Mb L: 29/40 MS: 1 ChangeByte- 00:08:17.377 [2024-11-29 19:21:37.263344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.263371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.263432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808078 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.263447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.377 [2024-11-29 19:21:37.263504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:78787878 cdw11:78788080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.377 [2024-11-29 19:21:37.263518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.637 #24 NEW cov: 12510 ft: 15312 corp: 18/463b lim: 40 exec/s: 24 rss: 73Mb L: 29/40 MS: 1 CopyPart- 00:08:17.637 [2024-11-29 19:21:37.323232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:009c2046 cdw11:50120da6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.323259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.637 #25 NEW cov: 12510 ft: 15363 corp: 19/472b lim: 40 exec/s: 25 rss: 73Mb L: 9/40 MS: 1 ChangeBit- 00:08:17.637 [2024-11-29 19:21:37.383668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.383695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.637 [2024-11-29 19:21:37.383758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808078 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.383771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.637 [2024-11-29 19:21:37.383830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:78788078 cdw11:78788080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.383843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.637 #26 NEW cov: 12510 ft: 15393 corp: 20/501b lim: 40 exec/s: 26 rss: 73Mb L: 29/40 MS: 1 ChangeBinInt- 00:08:17.637 [2024-11-29 19:21:37.423528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a800a80 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.423555] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.637 #27 NEW cov: 12510 ft: 15404 corp: 21/510b lim: 40 exec/s: 27 rss: 73Mb L: 9/40 MS: 1 CrossOver- 00:08:17.637 [2024-11-29 19:21:37.483961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.483989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.637 [2024-11-29 19:21:37.484049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.484064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.637 [2024-11-29 19:21:37.484123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.484137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.637 #28 NEW cov: 12510 ft: 15416 corp: 22/539b lim: 40 exec/s: 28 rss: 73Mb L: 29/40 MS: 1 ChangeBit- 00:08:17.637 [2024-11-29 19:21:37.523815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:009c2046 cdw11:50120da5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-11-29 19:21:37.523842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.897 #29 NEW cov: 12510 ft: 15435 corp: 23/548b lim: 40 exec/s: 29 rss: 73Mb L: 9/40 MS: 1 ChangeBinInt- 00:08:17.897 [2024-11-29 19:21:37.583972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a010a01 cdw11:94004609 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.583998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.897 #30 NEW cov: 12510 ft: 15454 corp: 24/557b lim: 40 exec/s: 30 rss: 73Mb L: 9/40 MS: 1 ChangeBit- 00:08:17.897 [2024-11-29 19:21:37.624056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:009c2046 cdw11:50120d80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.624082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.897 #31 NEW cov: 12510 ft: 15464 corp: 25/566b lim: 40 exec/s: 31 rss: 73Mb L: 9/40 MS: 1 CrossOver- 00:08:17.897 [2024-11-29 19:21:37.664425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.664451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.897 [2024-11-29 19:21:37.664509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808078 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.664524] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.897 [2024-11-29 19:21:37.664581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:78787875 cdw11:78787880 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.664595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.897 #32 NEW cov: 12510 ft: 15486 corp: 26/596b lim: 40 exec/s: 32 rss: 73Mb L: 30/40 MS: 1 InsertByte- 00:08:17.897 [2024-11-29 19:21:37.704797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a0a8080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.704823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.897 [2024-11-29 19:21:37.704882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-11-29 19:21:37.704897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.898 [2024-11-29 19:21:37.704955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:808f8080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.704969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.898 [2024-11-29 19:21:37.705024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8080800a cdw11:80800194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.705038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.898 [2024-11-29 19:21:37.705095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:20ececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.705108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.898 #33 NEW cov: 12510 ft: 15498 corp: 27/636b lim: 40 exec/s: 33 rss: 73Mb L: 40/40 MS: 1 CrossOver- 00:08:17.898 [2024-11-29 19:21:37.764844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.764870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.898 [2024-11-29 19:21:37.764927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.764941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.898 [2024-11-29 19:21:37.764996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.765010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.898 [2024-11-29 19:21:37.765065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80009420 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.898 [2024-11-29 19:21:37.765078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.898 #34 NEW cov: 12510 ft: 15515 corp: 28/673b lim: 40 exec/s: 34 rss: 73Mb L: 37/40 MS: 1 PersAutoDict- DE: "\000\224 FP\022\015\246"- 00:08:18.158 [2024-11-29 19:21:37.805098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a0a8080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.805125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.805186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.805199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.805257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:808f8080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.805271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.805326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8080800a cdw11:80800194 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.805340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.805395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:20ececec cdw11:ecec92ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.805410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.158 #35 NEW cov: 12510 ft: 15525 corp: 29/713b lim: 40 exec/s: 35 rss: 74Mb L: 40/40 MS: 1 ChangeByte- 00:08:18.158 [2024-11-29 19:21:37.865125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:ecec9420 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.865152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.865212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ecececec cdw11:ecececec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.865226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.865282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ecececec cdw11:ececec46 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.865296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.865353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:09670194 cdw11:861a8623 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.865366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.158 #36 NEW cov: 12510 ft: 15553 corp: 30/745b lim: 40 exec/s: 36 rss: 74Mb L: 32/40 MS: 1 CopyPart- 00:08:18.158 [2024-11-29 19:21:37.925033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a019420 cdw11:46096701 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.925060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.925119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:94204609 cdw11:67861a86 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.925133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.158 #37 NEW cov: 12510 ft: 15559 corp: 31/763b lim: 40 exec/s: 37 rss: 74Mb L: 18/40 MS: 1 CopyPart- 00:08:18.158 [2024-11-29 19:21:37.965292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.965317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.965377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.965391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:37.965450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80802380 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:37.965464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.158 #38 NEW cov: 12510 ft: 15592 corp: 32/793b lim: 40 exec/s: 38 rss: 74Mb L: 30/40 MS: 1 InsertByte- 00:08:18.158 [2024-11-29 19:21:38.005525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:38.005551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:38.005613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:38.005628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:38.005682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:38.005695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.158 [2024-11-29 19:21:38.005752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80009420 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.158 [2024-11-29 19:21:38.005765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.158 #39 NEW cov: 12510 ft: 15613 corp: 33/830b lim: 40 exec/s: 39 rss: 74Mb L: 37/40 MS: 1 ShuffleBytes- 00:08:18.419 [2024-11-29 19:21:38.065307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a010a2c cdw11:01940046 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.419 [2024-11-29 19:21:38.065335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.419 #40 NEW cov: 12510 ft: 15662 corp: 34/840b lim: 40 exec/s: 40 rss: 74Mb L: 10/40 MS: 1 InsertByte- 00:08:18.419 [2024-11-29 19:21:38.125854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9a808080 cdw11:3d808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.419 [2024-11-29 19:21:38.125879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.419 [2024-11-29 19:21:38.125935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.419 [2024-11-29 19:21:38.125949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.419 [2024-11-29 19:21:38.126007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:80808080 cdw11:80808080 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.419 [2024-11-29 19:21:38.126021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.419 [2024-11-29 19:21:38.126079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:80808080 cdw11:80009420 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.419 [2024-11-29 19:21:38.126092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.419 #41 NEW cov: 12510 ft: 15683 corp: 35/877b lim: 40 exec/s: 20 rss: 74Mb L: 37/40 MS: 1 ChangeByte- 00:08:18.419 #41 DONE cov: 12510 ft: 15683 corp: 35/877b lim: 40 exec/s: 20 rss: 74Mb 00:08:18.419 ###### Recommended dictionary. ###### 00:08:18.419 "\001\224 F\011g\206\032" # Uses: 0 00:08:18.419 "\000\224 FP\022\015\246" # Uses: 1 00:08:18.419 ###### End of recommended dictionary. 
###### 00:08:18.419 Done 41 runs in 2 second(s) 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.419 19:21:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:18.419 [2024-11-29 19:21:38.313720] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
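The nvmf/run.sh trace above shows how start_llvm_fuzz gives each fuzzer instance its own NVMe/TCP listener: the fuzzer index is zero-padded and appended to a 44xx port, a sed pass retargets the template fuzz_json.conf from the default trsvcid 4420, and the resulting transport ID is handed to llvm_nvme_fuzz via -F. A minimal sketch of that derivation, assuming the 4400 base, the redirect into $nvmf_cfg (the trace does not show where sed's output goes), and the $rootdir variable, none of which are confirmed by the log:

fuzzer_type=11
port=44$(printf %02d "$fuzzer_type")    # printf %02d 11 -> "11", hence port=4411 as logged
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"    # $rootdir: assumed SPDK checkout
mkdir -p "$corpus_dir"
# Retarget the JSON template from the default listener port 4420 to this run's port.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

The same pattern yields trsvcid 4412 for fuzzer 12 in the trace that follows.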
00:08:18.419 [2024-11-29 19:21:38.313792] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1750659 ] 00:08:18.679 [2024-11-29 19:21:38.500498] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.679 [2024-11-29 19:21:38.513310] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.679 [2024-11-29 19:21:38.565868] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.679 [2024-11-29 19:21:38.582249] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:18.938 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.938 INFO: Seed: 3433378453 00:08:18.938 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:18.938 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:18.938 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:18.938 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.938 #2 INITED exec/s: 0 rss: 64Mb 00:08:18.938 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.938 This may also happen if the target rejected all inputs we tried so far 00:08:18.938 [2024-11-29 19:21:38.648463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.938 [2024-11-29 19:21:38.648502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.196 NEW_FUNC[1/717]: 0x468278 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:19.196 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.196 #5 NEW cov: 12278 ft: 12280 corp: 2/11b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:19.197 [2024-11-29 19:21:38.999479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.197 [2024-11-29 19:21:38.999521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.197 #11 NEW cov: 12399 ft: 12871 corp: 3/21b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 CrossOver- 00:08:19.197 [2024-11-29 19:21:39.069777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.197 [2024-11-29 19:21:39.069805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.197 #12 NEW cov: 12405 ft: 13117 corp: 4/35b lim: 40 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 CopyPart- 00:08:19.456 [2024-11-29 19:21:39.119877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.119903] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.456 #18 NEW cov: 12490 ft: 13411 corp: 5/43b lim: 40 exec/s: 0 rss: 72Mb L: 8/14 MS: 1 EraseBytes- 00:08:19.456 [2024-11-29 19:21:39.170013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.170040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.456 #19 NEW cov: 12490 ft: 13618 corp: 6/57b lim: 40 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 CMP- DE: "\013\001"- 00:08:19.456 [2024-11-29 19:21:39.240250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:0000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.240277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.456 #20 NEW cov: 12490 ft: 13710 corp: 7/69b lim: 40 exec/s: 0 rss: 72Mb L: 12/14 MS: 1 PersAutoDict- DE: "\013\001"- 00:08:19.456 [2024-11-29 19:21:39.290347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.290373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.456 #21 NEW cov: 12490 ft: 13774 corp: 8/78b lim: 40 exec/s: 0 rss: 72Mb L: 9/14 MS: 1 InsertByte- 00:08:19.456 [2024-11-29 19:21:39.361613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.361640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.456 [2024-11-29 19:21:39.361796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.361813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.456 [2024-11-29 19:21:39.361953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.361972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.456 [2024-11-29 19:21:39.362110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.456 [2024-11-29 19:21:39.362128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.715 #22 NEW cov: 12490 ft: 14615 corp: 9/114b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:19.715 [2024-11-29 19:21:39.431456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002d2d2d cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.715 [2024-11-29 19:21:39.431483] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.715 [2024-11-29 19:21:39.431623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2d2d2d2d cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.715 [2024-11-29 19:21:39.431640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.716 [2024-11-29 19:21:39.431784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2d2d2d2d cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.716 [2024-11-29 19:21:39.431802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.716 #25 NEW cov: 12490 ft: 14925 corp: 10/143b lim: 40 exec/s: 0 rss: 72Mb L: 29/36 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:19.716 [2024-11-29 19:21:39.480896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00003a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.716 [2024-11-29 19:21:39.480922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.716 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:19.716 #26 NEW cov: 12513 ft: 15006 corp: 11/157b lim: 40 exec/s: 0 rss: 73Mb L: 14/36 MS: 1 CMP- DE: ":\000\000\000"- 00:08:19.716 [2024-11-29 19:21:39.551511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:0000a31f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.716 [2024-11-29 19:21:39.551540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.716 [2024-11-29 19:21:39.551686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3bd04720 cdw11:94000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.716 [2024-11-29 19:21:39.551721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.716 #27 NEW cov: 12513 ft: 15229 corp: 12/175b lim: 40 exec/s: 0 rss: 73Mb L: 18/36 MS: 1 CMP- DE: "\243\037;\320G \224\000"- 00:08:19.716 [2024-11-29 19:21:39.601421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.716 [2024-11-29 19:21:39.601449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.975 #29 NEW cov: 12513 ft: 15391 corp: 13/184b lim: 40 exec/s: 0 rss: 73Mb L: 9/36 MS: 2 CopyPart-CMP- DE: "\000\000\000\000\000\000\000H"- 00:08:19.975 [2024-11-29 19:21:39.651567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.975 [2024-11-29 19:21:39.651595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.975 #30 NEW cov: 12513 ft: 15396 corp: 14/193b lim: 40 exec/s: 30 rss: 73Mb L: 9/36 MS: 1 EraseBytes- 00:08:19.975 [2024-11-29 19:21:39.701701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:a31f3bd0 cdw11:47209400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.975 [2024-11-29 19:21:39.701727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.975 #31 NEW cov: 12513 ft: 15407 corp: 15/202b lim: 40 exec/s: 31 rss: 73Mb L: 9/36 MS: 1 PersAutoDict- DE: "\243\037;\320G \224\000"- 00:08:19.975 [2024-11-29 19:21:39.771951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00003a0b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.975 [2024-11-29 19:21:39.771980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.975 #32 NEW cov: 12513 ft: 15448 corp: 16/216b lim: 40 exec/s: 32 rss: 73Mb L: 14/36 MS: 1 PersAutoDict- DE: "\013\001"- 00:08:19.975 [2024-11-29 19:21:39.842135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.975 [2024-11-29 19:21:39.842163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.235 #33 NEW cov: 12513 ft: 15462 corp: 17/225b lim: 40 exec/s: 33 rss: 73Mb L: 9/36 MS: 1 EraseBytes- 00:08:20.235 [2024-11-29 19:21:39.913268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:39.913295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.235 [2024-11-29 19:21:39.913458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:39.913477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.235 [2024-11-29 19:21:39.913625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:39.913643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.235 [2024-11-29 19:21:39.913785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:39.913800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.235 #34 NEW cov: 12513 ft: 15501 corp: 18/263b lim: 40 exec/s: 34 rss: 73Mb L: 38/38 MS: 1 CopyPart- 00:08:20.235 [2024-11-29 19:21:39.982688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000031 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:39.982717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.235 #35 NEW cov: 12513 ft: 15555 corp: 19/273b lim: 40 exec/s: 35 rss: 73Mb L: 10/38 MS: 1 InsertByte- 00:08:20.235 [2024-11-29 19:21:40.053253] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00003a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:40.053285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.235 [2024-11-29 19:21:40.053431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:40.053449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.235 #36 NEW cov: 12513 ft: 15581 corp: 20/295b lim: 40 exec/s: 36 rss: 73Mb L: 22/38 MS: 1 CMP- DE: "\016\000\000\000\000\000\000\000"- 00:08:20.235 [2024-11-29 19:21:40.113422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002d2d2d cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:40.113452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.235 [2024-11-29 19:21:40.113604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2d2d2d2d cdw11:0b013d0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.235 [2024-11-29 19:21:40.113622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.494 #37 NEW cov: 12513 ft: 15653 corp: 21/311b lim: 40 exec/s: 37 rss: 73Mb L: 16/38 MS: 1 EraseBytes- 00:08:20.494 [2024-11-29 19:21:40.183643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.494 [2024-11-29 19:21:40.183673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.494 [2024-11-29 19:21:40.183812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:c5c5c5c5 cdw11:c5c5c5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.494 [2024-11-29 19:21:40.183829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.494 #38 NEW cov: 12513 ft: 15672 corp: 22/331b lim: 40 exec/s: 38 rss: 73Mb L: 20/38 MS: 1 InsertRepeatedBytes- 00:08:20.494 [2024-11-29 19:21:40.233497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.494 [2024-11-29 19:21:40.233526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.494 #39 NEW cov: 12513 ft: 15680 corp: 23/341b lim: 40 exec/s: 39 rss: 73Mb L: 10/38 MS: 1 ChangeBit- 00:08:20.494 [2024-11-29 19:21:40.284141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.495 [2024-11-29 19:21:40.284170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.495 [2024-11-29 19:21:40.284316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:c5c5c5c5 cdw11:c5c5c5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.495 [2024-11-29 19:21:40.284334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.495 #40 NEW cov: 12513 ft: 15713 corp: 24/361b lim: 40 exec/s: 40 rss: 73Mb L: 20/38 MS: 1 ShuffleBytes- 00:08:20.495 [2024-11-29 19:21:40.354012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:a31f3bd0 cdw11:4720b400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.495 [2024-11-29 19:21:40.354040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.495 #41 NEW cov: 12513 ft: 15831 corp: 25/370b lim: 40 exec/s: 41 rss: 73Mb L: 9/38 MS: 1 ChangeBit- 00:08:20.755 [2024-11-29 19:21:40.424822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002d2d3a cdw11:0000002d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.424862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.755 [2024-11-29 19:21:40.425014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2d2d2d2d cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.425031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.755 [2024-11-29 19:21:40.425165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2d2d2d2d cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.425184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.755 #42 NEW cov: 12513 ft: 15840 corp: 26/399b lim: 40 exec/s: 42 rss: 73Mb L: 29/38 MS: 1 PersAutoDict- DE: ":\000\000\000"- 00:08:20.755 [2024-11-29 19:21:40.474683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002d2d2d cdw11:2d2d2d2a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.474710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.755 [2024-11-29 19:21:40.474867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:002d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.474883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.755 #43 NEW cov: 12513 ft: 15897 corp: 27/422b lim: 40 exec/s: 43 rss: 74Mb L: 23/38 MS: 1 CrossOver- 00:08:20.755 [2024-11-29 19:21:40.544935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:002dced2 cdw11:2d2d2d2d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.544961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.755 [2024-11-29 19:21:40.545113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2d2d2d2d cdw11:0b013d0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.545130] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.755 #44 NEW cov: 12513 ft: 15942 corp: 28/438b lim: 40 exec/s: 44 rss: 74Mb L: 16/38 MS: 1 ChangeBinInt- 00:08:20.755 [2024-11-29 19:21:40.584798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.584824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.755 #45 NEW cov: 12513 ft: 16004 corp: 29/452b lim: 40 exec/s: 45 rss: 74Mb L: 14/38 MS: 1 ChangeBit- 00:08:20.755 [2024-11-29 19:21:40.634988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2af60000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.755 [2024-11-29 19:21:40.635014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.755 #46 NEW cov: 12513 ft: 16020 corp: 30/466b lim: 40 exec/s: 23 rss: 74Mb L: 14/38 MS: 1 ChangeBinInt- 00:08:20.755 #46 DONE cov: 12513 ft: 16020 corp: 30/466b lim: 40 exec/s: 23 rss: 74Mb 00:08:20.755 ###### Recommended dictionary. ###### 00:08:20.755 "\013\001" # Uses: 2 00:08:20.755 ":\000\000\000" # Uses: 1 00:08:20.755 "\243\037;\320G \224\000" # Uses: 1 00:08:20.755 "\000\000\000\000\000\000\000H" # Uses: 0 00:08:20.755 "\016\000\000\000\000\000\000\000" # Uses: 0 00:08:20.755 ###### End of recommended dictionary. ###### 00:08:20.755 Done 46 runs in 2 second(s) 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:21.015 19:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:21.015 [2024-11-29 19:21:40.805193] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:21.015 [2024-11-29 19:21:40.805274] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1751193 ] 00:08:21.274 [2024-11-29 19:21:40.992317] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.274 [2024-11-29 19:21:41.004757] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.274 [2024-11-29 19:21:41.057147] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.274 [2024-11-29 19:21:41.073478] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:21.274 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.274 INFO: Seed: 1628415122 00:08:21.274 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:21.274 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:21.274 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.274 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.274 #2 INITED exec/s: 0 rss: 64Mb 00:08:21.274 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
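Each run finishes by printing a "Recommended dictionary" block (fuzzer 11's appears above, after its #46 DONE line): token strings in C-style octal escapes together with a per-token use count. libFuzzer accepts such tokens back through an AFL-style dictionary file of \xNN hex escapes passed with -dict=<file>. A hand-converted sketch of the fuzzer-11 entries follows; the file path is hypothetical and the log does not show this wrapper forwarding a -dict argument, so treat it as an illustration of the format only:

# Hypothetical dictionary rebuilt from the fuzzer-11 recommended entries (octal -> hex).
cat > /tmp/nvmf_11.dict <<'EOF'
# "\013\001"
kw1="\x0b\x01"
# ":\000\000\000"
kw2=":\x00\x00\x00"
# "\243\037;\320G \224\000"
kw3="\xa3\x1f;\xd0G \x94\x00"
# "\000\000\000\000\000\000\000H"
kw4="\x00\x00\x00\x00\x00\x00\x00H"
# "\016\000\000\000\000\000\000\000"
kw5="\x0e\x00\x00\x00\x00\x00\x00\x00"
EOF

Seeded this way, a fresh run can start from the byte patterns that the "PersAutoDict" mutations above had to rediscover on their own.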
00:08:21.275 This may also happen if the target rejected all inputs we tried so far 00:08:21.275 [2024-11-29 19:21:41.122443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.275 [2024-11-29 19:21:41.122471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.275 [2024-11-29 19:21:41.122530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.275 [2024-11-29 19:21:41.122545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.533 NEW_FUNC[1/717]: 0x469fe8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:21.533 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.533 #3 NEW cov: 12284 ft: 12278 corp: 2/24b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:21.792 [2024-11-29 19:21:41.443093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.443129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.792 #14 NEW cov: 12397 ft: 13541 corp: 3/37b lim: 40 exec/s: 0 rss: 72Mb L: 13/23 MS: 1 InsertRepeatedBytes- 00:08:21.792 [2024-11-29 19:21:41.483099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000025 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.483126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.792 #15 NEW cov: 12403 ft: 13710 corp: 4/50b lim: 40 exec/s: 0 rss: 72Mb L: 13/23 MS: 1 ChangeByte- 00:08:21.792 [2024-11-29 19:21:41.543569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.543595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.792 [2024-11-29 19:21:41.543656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.543671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.792 [2024-11-29 19:21:41.543726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.543740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.792 #16 NEW cov: 12488 ft: 14193 corp: 5/76b lim: 40 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 CrossOver- 00:08:21.792 [2024-11-29 19:21:41.584257] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.584285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.792 [2024-11-29 19:21:41.584344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.584359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.792 [2024-11-29 19:21:41.584417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.584431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.792 [2024-11-29 19:21:41.584490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.584504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.792 [2024-11-29 19:21:41.584562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.584576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.792 #17 NEW cov: 12488 ft: 14600 corp: 6/116b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CopyPart- 00:08:21.792 [2024-11-29 19:21:41.643531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.792 [2024-11-29 19:21:41.643560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.793 #18 NEW cov: 12488 ft: 14660 corp: 7/129b lim: 40 exec/s: 0 rss: 72Mb L: 13/40 MS: 1 ShuffleBytes- 00:08:22.052 [2024-11-29 19:21:41.703855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.703881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.703939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.703953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 #19 NEW cov: 12488 ft: 14795 corp: 8/152b lim: 40 exec/s: 0 rss: 73Mb L: 23/40 MS: 1 ChangeBit- 00:08:22.052 [2024-11-29 19:21:41.764356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.764383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.764442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.764456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.764514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.764528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.764584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.764602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.052 #20 NEW cov: 12488 ft: 14882 corp: 9/191b lim: 40 exec/s: 0 rss: 73Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:22.052 [2024-11-29 19:21:41.824015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:25000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.824041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 #21 NEW cov: 12488 ft: 14983 corp: 10/201b lim: 40 exec/s: 0 rss: 73Mb L: 10/40 MS: 1 EraseBytes- 00:08:22.052 [2024-11-29 19:21:41.864292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.864320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.864378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.864393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 #22 NEW cov: 12488 ft: 15011 corp: 11/224b lim: 40 exec/s: 0 rss: 73Mb L: 23/40 MS: 1 ChangeBit- 00:08:22.052 [2024-11-29 19:21:41.924574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.924605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.924664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00fbffff cdw11:f5ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.924681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.052 [2024-11-29 19:21:41.924737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ff000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.052 [2024-11-29 19:21:41.924751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.052 #28 NEW cov: 12488 ft: 15026 corp: 12/250b lim: 40 exec/s: 0 rss: 73Mb L: 26/40 MS: 1 ChangeBinInt- 00:08:22.311 [2024-11-29 19:21:41.964778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:41.964805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:41.964864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:41.964879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:41.964936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:41.964950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.311 #29 NEW cov: 12488 ft: 15142 corp: 13/276b lim: 40 exec/s: 0 rss: 73Mb L: 26/40 MS: 1 ChangeByte- 00:08:22.311 [2024-11-29 19:21:42.004695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.004722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:42.004783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000017 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.004797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.311 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:22.311 #30 NEW cov: 12511 ft: 15187 corp: 14/299b lim: 40 exec/s: 0 rss: 73Mb L: 23/40 MS: 1 ChangeBinInt- 00:08:22.311 [2024-11-29 19:21:42.065141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.065169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:42.065226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.065240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:42.065295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.065309] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:42.065365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.065378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.311 #31 NEW cov: 12511 ft: 15267 corp: 15/338b lim: 40 exec/s: 0 rss: 73Mb L: 39/40 MS: 1 CopyPart- 00:08:22.311 [2024-11-29 19:21:42.104825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:25000000 cdw11:20000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.104852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.311 #32 NEW cov: 12511 ft: 15327 corp: 16/348b lim: 40 exec/s: 32 rss: 73Mb L: 10/40 MS: 1 ChangeBit- 00:08:22.311 [2024-11-29 19:21:42.165134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.165160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.311 [2024-11-29 19:21:42.165217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.311 [2024-11-29 19:21:42.165231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.311 #33 NEW cov: 12511 ft: 15343 corp: 17/365b lim: 40 exec/s: 33 rss: 73Mb L: 17/40 MS: 1 EraseBytes- 00:08:22.571 [2024-11-29 19:21:42.225259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.571 [2024-11-29 19:21:42.225285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.571 [2024-11-29 19:21:42.225343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.225357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.572 #34 NEW cov: 12511 ft: 15367 corp: 18/388b lim: 40 exec/s: 34 rss: 73Mb L: 23/40 MS: 1 CrossOver- 00:08:22.572 [2024-11-29 19:21:42.265725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.265751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 19:21:42.265811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.265825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 
19:21:42.265882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00feffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.265895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 19:21:42.265950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.265964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.572 #35 NEW cov: 12511 ft: 15385 corp: 19/427b lim: 40 exec/s: 35 rss: 73Mb L: 39/40 MS: 1 ChangeBinInt- 00:08:22.572 [2024-11-29 19:21:42.325591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.325621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 19:21:42.325679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000022 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.325697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.572 #36 NEW cov: 12511 ft: 15405 corp: 20/445b lim: 40 exec/s: 36 rss: 73Mb L: 18/40 MS: 1 InsertByte- 00:08:22.572 [2024-11-29 19:21:42.385568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:25000000 cdw11:20000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.385595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.572 #37 NEW cov: 12511 ft: 15424 corp: 21/453b lim: 40 exec/s: 37 rss: 74Mb L: 8/40 MS: 1 EraseBytes- 00:08:22.572 [2024-11-29 19:21:42.446224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.446250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 19:21:42.446309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.446323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 19:21:42.446378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.446392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.572 [2024-11-29 19:21:42.446447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.572 [2024-11-29 19:21:42.446460] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.832 #38 NEW cov: 12511 ft: 15452 corp: 22/492b lim: 40 exec/s: 38 rss: 74Mb L: 39/40 MS: 1 ShuffleBytes- 00:08:22.832 [2024-11-29 19:21:42.506066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.506092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.506151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.506165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.832 #39 NEW cov: 12511 ft: 15482 corp: 23/515b lim: 40 exec/s: 39 rss: 74Mb L: 23/40 MS: 1 ChangeByte- 00:08:22.832 [2024-11-29 19:21:42.546328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.546355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.546414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:25000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.546429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.546486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00002500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.546499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.832 #40 NEW cov: 12511 ft: 15510 corp: 24/542b lim: 40 exec/s: 40 rss: 74Mb L: 27/40 MS: 1 CrossOver- 00:08:22.832 [2024-11-29 19:21:42.606338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.606364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.606423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.606437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.832 #41 NEW cov: 12511 ft: 15525 corp: 25/565b lim: 40 exec/s: 41 rss: 74Mb L: 23/40 MS: 1 ShuffleBytes- 00:08:22.832 [2024-11-29 19:21:42.646977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.647002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.832 
[2024-11-29 19:21:42.647062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.647076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.647133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.647147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.647203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.647217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.647273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:005d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.647287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.832 #42 NEW cov: 12511 ft: 15535 corp: 26/605b lim: 40 exec/s: 42 rss: 74Mb L: 40/40 MS: 1 InsertByte- 00:08:22.832 [2024-11-29 19:21:42.707149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.707174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.707232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.707246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.707303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.707316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.707375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00300000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.707388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.832 [2024-11-29 19:21:42.707446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:005d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.832 [2024-11-29 19:21:42.707461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.091 #43 NEW cov: 12511 ft: 15548 corp: 27/645b lim: 40 exec/s: 43 rss: 74Mb L: 40/40 MS: 1 
ChangeByte- 00:08:23.091 [2024-11-29 19:21:42.767014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.767040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.092 [2024-11-29 19:21:42.767098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.767112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.092 [2024-11-29 19:21:42.767170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000008b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.767184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.092 #44 NEW cov: 12511 ft: 15556 corp: 28/671b lim: 40 exec/s: 44 rss: 74Mb L: 26/40 MS: 1 ChangeByte- 00:08:23.092 [2024-11-29 19:21:42.806742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.806767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.092 #46 NEW cov: 12511 ft: 15567 corp: 29/680b lim: 40 exec/s: 46 rss: 74Mb L: 9/40 MS: 2 ChangeBinInt-CrossOver- 00:08:23.092 [2024-11-29 19:21:42.846872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.846897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.092 #47 NEW cov: 12511 ft: 15582 corp: 30/693b lim: 40 exec/s: 47 rss: 74Mb L: 13/40 MS: 1 ChangeBinInt- 00:08:23.092 [2024-11-29 19:21:42.887002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2500e800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.887029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.092 #48 NEW cov: 12511 ft: 15589 corp: 31/704b lim: 40 exec/s: 48 rss: 74Mb L: 11/40 MS: 1 InsertByte- 00:08:23.092 [2024-11-29 19:21:42.927300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.927326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.092 [2024-11-29 19:21:42.927385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.927399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.092 #49 NEW cov: 12511 ft: 15600 corp: 32/727b lim: 40 exec/s: 49 rss: 74Mb L: 23/40 MS: 1 
ChangeByte- 00:08:23.092 [2024-11-29 19:21:42.987624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.987651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.092 [2024-11-29 19:21:42.987711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fbfffff5 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.987726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.092 [2024-11-29 19:21:42.987779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.092 [2024-11-29 19:21:42.987793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.351 #50 NEW cov: 12511 ft: 15676 corp: 33/752b lim: 40 exec/s: 50 rss: 74Mb L: 25/40 MS: 1 EraseBytes- 00:08:23.351 [2024-11-29 19:21:43.047436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.351 [2024-11-29 19:21:43.047462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.351 #51 NEW cov: 12511 ft: 15689 corp: 34/765b lim: 40 exec/s: 51 rss: 74Mb L: 13/40 MS: 1 ChangeBinInt- 00:08:23.351 [2024-11-29 19:21:43.087841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.351 [2024-11-29 19:21:43.087867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.351 [2024-11-29 19:21:43.087926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.351 [2024-11-29 19:21:43.087941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.351 [2024-11-29 19:21:43.087999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.351 [2024-11-29 19:21:43.088012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.351 #52 NEW cov: 12511 ft: 15704 corp: 35/795b lim: 40 exec/s: 26 rss: 74Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:08:23.351 #52 DONE cov: 12511 ft: 15704 corp: 35/795b lim: 40 exec/s: 26 rss: 74Mb 00:08:23.351 Done 52 runs in 2 second(s) 00:08:23.351 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.351 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.351 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.351 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:23.351 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # 
local fuzzer_type=13 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.352 19:21:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:23.610 [2024-11-29 19:21:43.276708] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:23.610 [2024-11-29 19:21:43.276777] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1751533 ] 00:08:23.610 [2024-11-29 19:21:43.463816] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.610 [2024-11-29 19:21:43.476476] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.869 [2024-11-29 19:21:43.528974] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.869 [2024-11-29 19:21:43.545356] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:23.869 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.869 INFO: Seed: 4102426673 00:08:23.869 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:23.869 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:23.869 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:23.869 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.869 #2 INITED exec/s: 0 rss: 65Mb 00:08:23.869 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:23.869 This may also happen if the target rejected all inputs we tried so far 00:08:23.869 [2024-11-29 19:21:43.622208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.869 [2024-11-29 19:21:43.622247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.869 [2024-11-29 19:21:43.622374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.869 [2024-11-29 19:21:43.622391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.869 [2024-11-29 19:21:43.622514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.869 [2024-11-29 19:21:43.622530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.869 [2024-11-29 19:21:43.622658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.869 [2024-11-29 19:21:43.622675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.129 NEW_FUNC[1/713]: 0x46bbb8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:24.129 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.129 #23 NEW cov: 12225 ft: 12222 corp: 2/36b lim: 40 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:24.129 [2024-11-29 19:21:43.953121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:43.953176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 [2024-11-29 19:21:43.953317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:43.953339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.129 [2024-11-29 19:21:43.953488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:43.953511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.129 [2024-11-29 19:21:43.953651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:43.953674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.129 NEW_FUNC[1/3]: 0x17e6658 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3959 00:08:24.129 NEW_FUNC[2/3]: 0x19c2988 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1615 00:08:24.129 #24 NEW cov: 12384 ft: 12929 corp: 3/68b lim: 40 exec/s: 0 rss: 72Mb L: 32/35 MS: 1 EraseBytes- 00:08:24.129 [2024-11-29 19:21:44.033230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:44.033257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.129 [2024-11-29 19:21:44.033386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:44.033402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.129 [2024-11-29 19:21:44.033534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:44.033549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.129 [2024-11-29 19:21:44.033678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d738d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.129 [2024-11-29 19:21:44.033694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.388 #25 NEW cov: 12390 ft: 13115 corp: 4/103b lim: 40 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:24.388 [2024-11-29 19:21:44.072986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.073013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.073151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.073170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.073302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.073319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.388 #26 NEW cov: 12475 ft: 13890 corp: 5/127b lim: 40 exec/s: 0 rss: 72Mb L: 24/35 MS: 1 EraseBytes- 00:08:24.388 [2024-11-29 19:21:44.123155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.123183] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.123312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.123327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.123463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.123478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.388 #27 NEW cov: 12475 ft: 13950 corp: 6/153b lim: 40 exec/s: 0 rss: 72Mb L: 26/35 MS: 1 EraseBytes- 00:08:24.388 [2024-11-29 19:21:44.173505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d787272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.173533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.173683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:72727272 cdw11:728d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.173700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.173829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.173848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.173979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.173995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.388 #28 NEW cov: 12475 ft: 14063 corp: 7/185b lim: 40 exec/s: 0 rss: 73Mb L: 32/35 MS: 1 ChangeBinInt- 00:08:24.388 [2024-11-29 19:21:44.243493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.243519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.243638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d0a8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.243655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.388 [2024-11-29 19:21:44.243796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:24.388 [2024-11-29 19:21:44.243811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.388 #29 NEW cov: 12475 ft: 14159 corp: 8/210b lim: 40 exec/s: 0 rss: 73Mb L: 25/35 MS: 1 CrossOver- 00:08:24.647 [2024-11-29 19:21:44.313954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.313986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.647 [2024-11-29 19:21:44.314113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.314129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.647 [2024-11-29 19:21:44.314246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.314262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.647 [2024-11-29 19:21:44.314379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d738d cdw11:8d8d738d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.314395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.647 #30 NEW cov: 12475 ft: 14189 corp: 9/245b lim: 40 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 CopyPart- 00:08:24.647 [2024-11-29 19:21:44.383478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.383504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.647 #31 NEW cov: 12475 ft: 14565 corp: 10/260b lim: 40 exec/s: 0 rss: 73Mb L: 15/35 MS: 1 EraseBytes- 00:08:24.647 [2024-11-29 19:21:44.454100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.454126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.647 [2024-11-29 19:21:44.454262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d3b8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.454278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.647 [2024-11-29 19:21:44.454405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.454421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.647 #32 NEW cov: 
12475 ft: 14657 corp: 11/284b lim: 40 exec/s: 0 rss: 73Mb L: 24/35 MS: 1 ChangeByte- 00:08:24.647 [2024-11-29 19:21:44.503863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8df600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.647 [2024-11-29 19:21:44.503889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.647 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:24.647 #36 NEW cov: 12498 ft: 14700 corp: 12/295b lim: 40 exec/s: 0 rss: 73Mb L: 11/35 MS: 4 CopyPart-ChangeBinInt-CrossOver-CMP- DE: "\000\000\000\017"- 00:08:24.907 [2024-11-29 19:21:44.554732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.554759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.554882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d0a8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.554900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.555017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.555033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.555158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.555174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.907 #37 NEW cov: 12498 ft: 14748 corp: 13/328b lim: 40 exec/s: 0 rss: 73Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:24.907 [2024-11-29 19:21:44.604896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.604923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.605046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.605062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.605193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.605208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.605334] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d738d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.605350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.907 #38 NEW cov: 12498 ft: 14801 corp: 14/365b lim: 40 exec/s: 38 rss: 73Mb L: 37/37 MS: 1 CMP- DE: "\001\004"- 00:08:24.907 [2024-11-29 19:21:44.654797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.654823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.654950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.654965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.655098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.655115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.907 #39 NEW cov: 12498 ft: 14829 corp: 15/395b lim: 40 exec/s: 39 rss: 73Mb L: 30/37 MS: 1 CrossOver- 00:08:24.907 [2024-11-29 19:21:44.725078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.725106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.725253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.725269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.725389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.725407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.907 #40 NEW cov: 12498 ft: 14909 corp: 16/425b lim: 40 exec/s: 40 rss: 73Mb L: 30/37 MS: 1 CopyPart- 00:08:24.907 [2024-11-29 19:21:44.795528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.795556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.795715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 
19:21:44.795731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.795866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.907 [2024-11-29 19:21:44.795882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.907 [2024-11-29 19:21:44.796014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.908 [2024-11-29 19:21:44.796031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.167 #41 NEW cov: 12498 ft: 14976 corp: 17/461b lim: 40 exec/s: 41 rss: 73Mb L: 36/37 MS: 1 CopyPart- 00:08:25.167 [2024-11-29 19:21:44.845364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.845392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.845524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d3b8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.845541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.845700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.845716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.167 #42 NEW cov: 12498 ft: 14994 corp: 18/485b lim: 40 exec/s: 42 rss: 73Mb L: 24/37 MS: 1 CrossOver- 00:08:25.167 [2024-11-29 19:21:44.915810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.915838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.915960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.915975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.916107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.916123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.916259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.916276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.167 #43 NEW cov: 12498 ft: 15007 corp: 19/518b lim: 40 exec/s: 43 rss: 73Mb L: 33/37 MS: 1 InsertByte- 00:08:25.167 [2024-11-29 19:21:44.965762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0f8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.965792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.965930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d3b8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.965948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:44.966084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:44.966101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.167 #44 NEW cov: 12498 ft: 15038 corp: 20/546b lim: 40 exec/s: 44 rss: 73Mb L: 28/37 MS: 1 PersAutoDict- DE: "\000\000\000\017"- 00:08:25.167 [2024-11-29 19:21:45.035835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:45.035862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:45.036005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d3b8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:45.036021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.167 [2024-11-29 19:21:45.036150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.167 [2024-11-29 19:21:45.036166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.167 #45 NEW cov: 12498 ft: 15066 corp: 21/573b lim: 40 exec/s: 45 rss: 73Mb L: 27/37 MS: 1 CrossOver- 00:08:25.425 [2024-11-29 19:21:45.086431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.086458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.086603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8daa cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.086622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.086747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.086766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.086912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.086928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.425 #46 NEW cov: 12498 ft: 15072 corp: 22/608b lim: 40 exec/s: 46 rss: 73Mb L: 35/37 MS: 1 ChangeByte- 00:08:25.425 [2024-11-29 19:21:45.136545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0f8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.136575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.136710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.136727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.136865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.136882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.137018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:3b8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.137037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.425 #47 NEW cov: 12498 ft: 15089 corp: 23/646b lim: 40 exec/s: 47 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:25.425 [2024-11-29 19:21:45.206716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.206745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.206873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.425 [2024-11-29 19:21:45.206889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.425 [2024-11-29 19:21:45.207015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.207032] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.426 [2024-11-29 19:21:45.207162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d0000 cdw11:000f8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.207179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.426 #48 NEW cov: 12498 ft: 15113 corp: 24/681b lim: 40 exec/s: 48 rss: 73Mb L: 35/38 MS: 1 PersAutoDict- DE: "\000\000\000\017"- 00:08:25.426 [2024-11-29 19:21:45.256813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.256839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.426 [2024-11-29 19:21:45.256984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.257008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.426 [2024-11-29 19:21:45.257135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d01 cdw11:048d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.257152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.426 [2024-11-29 19:21:45.257276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d738d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.257291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.426 #49 NEW cov: 12498 ft: 15121 corp: 25/716b lim: 40 exec/s: 49 rss: 73Mb L: 35/38 MS: 1 PersAutoDict- DE: "\001\004"- 00:08:25.426 [2024-11-29 19:21:45.306287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.426 [2024-11-29 19:21:45.306313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.426 #50 NEW cov: 12498 ft: 15156 corp: 26/731b lim: 40 exec/s: 50 rss: 73Mb L: 15/38 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:25.684 [2024-11-29 19:21:45.356934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d878d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.356960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.357097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.357113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.684 
[2024-11-29 19:21:45.357247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.357264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.684 #51 NEW cov: 12498 ft: 15175 corp: 27/762b lim: 40 exec/s: 51 rss: 73Mb L: 31/38 MS: 1 InsertByte- 00:08:25.684 [2024-11-29 19:21:45.407313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d0a8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.407339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.407471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.407486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.407628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.407644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.407772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.407788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.684 #52 NEW cov: 12498 ft: 15186 corp: 28/799b lim: 40 exec/s: 52 rss: 74Mb L: 37/38 MS: 1 CrossOver- 00:08:25.684 [2024-11-29 19:21:45.477554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d0a8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.477579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.477709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d3b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.477725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.477842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.477858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.477972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d3b8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.477988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.684 #53 NEW cov: 12498 ft: 15200 corp: 29/836b lim: 40 exec/s: 53 rss: 74Mb L: 37/38 MS: 1 CrossOver- 00:08:25.684 [2024-11-29 19:21:45.527702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.527728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.527856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.684 [2024-11-29 19:21:45.527873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.684 [2024-11-29 19:21:45.527994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d01 cdw11:048d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.685 [2024-11-29 19:21:45.528010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.685 [2024-11-29 19:21:45.528136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d8d0104 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.685 [2024-11-29 19:21:45.528152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.685 #54 NEW cov: 12498 ft: 15220 corp: 30/871b lim: 40 exec/s: 54 rss: 74Mb L: 35/38 MS: 1 PersAutoDict- DE: "\001\004"- 00:08:25.685 [2024-11-29 19:21:45.587660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.685 [2024-11-29 19:21:45.587686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.685 [2024-11-29 19:21:45.587818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.685 [2024-11-29 19:21:45.587835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.685 [2024-11-29 19:21:45.587968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.685 [2024-11-29 19:21:45.587987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.943 #55 NEW cov: 12498 ft: 15331 corp: 31/895b lim: 40 exec/s: 27 rss: 74Mb L: 24/38 MS: 1 ShuffleBytes- 00:08:25.943 #55 DONE cov: 12498 ft: 15331 corp: 31/895b lim: 40 exec/s: 27 rss: 74Mb 00:08:25.943 ###### Recommended dictionary. ###### 00:08:25.943 "\000\000\000\017" # Uses: 2 00:08:25.943 "\001\004" # Uses: 2 00:08:25.943 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:25.943 ###### End of recommended dictionary. 
###### 00:08:25.943 Done 55 runs in 2 second(s) 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.943 19:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:25.943 [2024-11-29 19:21:45.756850] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
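Note: the nvmf/run.sh trace above is the complete per-fuzzer-type launch recipe. Condensed into a standalone sketch below; FUZZER_TYPE and SPDK_DIR are illustrative placeholders, the port is derived as 44 + the zero-padded fuzzer type (matching the printf %02d step in the trace), and the redirections of the sed and echo output into the conf and suppression files are implied by run.sh rather than shown verbatim in the trace.

    # minimal launch sketch for one nvmf fuzzer type, assuming a built SPDK tree at $SPDK_DIR
    FUZZER_TYPE=14
    PORT=44$(printf %02d "$FUZZER_TYPE")          # 4414 for type 14, 4415 for type 15, ...
    CONF=/tmp/fuzz_json_${FUZZER_TYPE}.conf
    SUPP=/var/tmp/suppress_nvmf_fuzz
    CORPUS=$SPDK_DIR/../corpus/llvm_nvmf_${FUZZER_TYPE}
    mkdir -p "$CORPUS"
    # point the JSON target config at this run's port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$CONF"
    # known-benign leaks to ignore (the 'echo leak:' lines in the trace)
    echo leak:spdk_nvmf_qpair_disconnect  > "$SUPP"
    echo leak:nvmf_ctrlr_create          >> "$SUPP"
    LSAN_OPTIONS=report_objects=1:suppressions=$SUPP:print_suppressions=0 \
      "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK_DIR/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT" \
      -c "$CONF" -t 1 -D "$CORPUS" -Z "$FUZZER_TYPE"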
00:08:25.943 [2024-11-29 19:21:45.756922] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1752012 ] 00:08:26.202 [2024-11-29 19:21:45.939816] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.202 [2024-11-29 19:21:45.952086] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.202 [2024-11-29 19:21:46.004665] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.202 [2024-11-29 19:21:46.020976] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:26.202 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.202 INFO: Seed: 2281448552 00:08:26.202 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:26.202 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:26.202 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.202 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.202 #2 INITED exec/s: 0 rss: 64Mb 00:08:26.202 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.202 This may also happen if the target rejected all inputs we tried so far 00:08:26.202 [2024-11-29 19:21:46.070207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.202 [2024-11-29 19:21:46.070235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.460 NEW_FUNC[1/717]: 0x46d788 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:26.460 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.460 #11 NEW cov: 12254 ft: 12260 corp: 2/14b lim: 35 exec/s: 0 rss: 72Mb L: 13/13 MS: 4 CopyPart-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:26.719 [2024-11-29 19:21:46.381071] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.719 [2024-11-29 19:21:46.381105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.719 #12 NEW cov: 12379 ft: 12721 corp: 3/24b lim: 35 exec/s: 0 rss: 72Mb L: 10/13 MS: 1 EraseBytes- 00:08:26.719 [2024-11-29 19:21:46.441325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.719 [2024-11-29 19:21:46.441354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.720 [2024-11-29 19:21:46.441417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.441432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.720 #18 NEW cov: 12385 ft: 13775 corp: 4/38b lim: 35 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 InsertByte- 
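For readers skimming the libFuzzer status lines interleaved here: in an entry such as "#18 NEW cov: 12385 ft: 13775 corp: 4/38b lim: 35 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 InsertByte-", cov counts coverage points (PCs/edges) observed so far, ft counts features, corp gives the corpus size as units/bytes, lim is the current input-length cap, exec/s is the execution rate, rss is resident memory, L gives the new unit's length versus the largest corpus unit, and MS names the mutation sequence (here a single InsertByte) that produced the input; DE names the dictionary entry when one was used. A quick way to pull the coverage trajectory out of a saved console log (the filename is a placeholder):

    grep -oE 'cov: [0-9]+' console.log | awk '{print $2}' | uniq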
00:08:26.720 [2024-11-29 19:21:46.481246] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.481275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.720 #19 NEW cov: 12470 ft: 14061 corp: 5/48b lim: 35 exec/s: 0 rss: 72Mb L: 10/14 MS: 1 CopyPart- 00:08:26.720 [2024-11-29 19:21:46.541928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.541954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.720 [2024-11-29 19:21:46.542018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.542033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.720 [2024-11-29 19:21:46.542093] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.542108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.720 [2024-11-29 19:21:46.542168] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.542181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.720 #20 NEW cov: 12470 ft: 14524 corp: 6/77b lim: 35 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:26.720 [2024-11-29 19:21:46.601724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.601751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.720 [2024-11-29 19:21:46.601817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.720 [2024-11-29 19:21:46.601833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.980 #21 NEW cov: 12470 ft: 14606 corp: 7/97b lim: 35 exec/s: 0 rss: 72Mb L: 20/29 MS: 1 CopyPart- 00:08:26.980 [2024-11-29 19:21:46.661885] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.661911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.980 [2024-11-29 19:21:46.661973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.661987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.980 #22 NEW cov: 12470 ft: 14671 corp: 8/112b lim: 35 exec/s: 0 rss: 72Mb L: 15/29 MS: 1 EraseBytes- 
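Each NOTICE pair in this stream is one fuzzed admin command and its completion: nvme_qpair.c:225 prints the submitted command (opcode, cid, cdw10), and nvme_qpair.c:477 prints the controller's status for the matching qid/cid. A worked reading of one pair from above, with annotations added per the NVMe spec:

    SET FEATURES RESERVED cid:4 cdw10:00000000    <- admin opcode 0x09; printed as
                                                     RESERVED because CDW10 bits 7:0
                                                     select a reserved feature ID
    INVALID FIELD (00/02) qid:0 cid:4 sqhd:000f   <- status code type 0x0 (generic) /
                                                     status code 0x02 (Invalid Field in
                                                     Command), completing the same cid

The DIRECTIVE RECEIVE (1a) / INVALID OPCODE (00/01) pairs from the previous fuzzer follow the same pattern for opcode 0x1a with status code 0x01 (Invalid Command Opcode).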
00:08:26.980 [2024-11-29 19:21:46.721875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.721901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.980 #23 NEW cov: 12470 ft: 14772 corp: 9/125b lim: 35 exec/s: 0 rss: 72Mb L: 13/29 MS: 1 ChangeByte- 00:08:26.980 [2024-11-29 19:21:46.761995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.762022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.980 #24 NEW cov: 12470 ft: 14877 corp: 10/135b lim: 35 exec/s: 0 rss: 72Mb L: 10/29 MS: 1 ChangeBit- 00:08:26.980 [2024-11-29 19:21:46.802577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.802608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.980 [2024-11-29 19:21:46.802672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.802685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.980 [2024-11-29 19:21:46.802745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.802758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.980 [2024-11-29 19:21:46.802820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.802833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.980 #25 NEW cov: 12470 ft: 14956 corp: 11/169b lim: 35 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:26.980 [2024-11-29 19:21:46.862459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.862485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.980 [2024-11-29 19:21:46.862547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.980 [2024-11-29 19:21:46.862561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.240 #26 NEW cov: 12470 ft: 14989 corp: 12/187b lim: 35 exec/s: 0 rss: 73Mb L: 18/34 MS: 1 CrossOver- 00:08:27.240 [2024-11-29 19:21:46.922976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:46.923002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:27.240 [2024-11-29 19:21:46.923066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:46.923079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.240 [2024-11-29 19:21:46.923140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:46.923154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.240 [2024-11-29 19:21:46.923215] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:46.923229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.240 #27 NEW cov: 12470 ft: 15044 corp: 13/216b lim: 35 exec/s: 0 rss: 73Mb L: 29/34 MS: 1 ShuffleBytes- 00:08:27.240 [2024-11-29 19:21:46.962572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:46.962604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.240 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:27.240 #28 NEW cov: 12493 ft: 15158 corp: 14/226b lim: 35 exec/s: 0 rss: 73Mb L: 10/34 MS: 1 CopyPart- 00:08:27.240 [2024-11-29 19:21:47.002667] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.002694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.240 #29 NEW cov: 12493 ft: 15174 corp: 15/236b lim: 35 exec/s: 0 rss: 73Mb L: 10/34 MS: 1 ShuffleBytes- 00:08:27.240 [2024-11-29 19:21:47.043307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.043333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.240 [2024-11-29 19:21:47.043397] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.043412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.240 [2024-11-29 19:21:47.043475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.043489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.240 [2024-11-29 19:21:47.043550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.043564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:08:27.240 #30 NEW cov: 12493 ft: 15199 corp: 16/270b lim: 35 exec/s: 30 rss: 73Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:27.240 [2024-11-29 19:21:47.102944] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.102971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.240 #31 NEW cov: 12493 ft: 15317 corp: 17/280b lim: 35 exec/s: 31 rss: 73Mb L: 10/34 MS: 1 CrossOver- 00:08:27.240 [2024-11-29 19:21:47.143184] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.143211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.240 [2024-11-29 19:21:47.143274] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.240 [2024-11-29 19:21:47.143289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.500 #32 NEW cov: 12493 ft: 15420 corp: 18/296b lim: 35 exec/s: 32 rss: 73Mb L: 16/34 MS: 1 CopyPart- 00:08:27.500 [2024-11-29 19:21:47.203248] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.203275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.500 #33 NEW cov: 12493 ft: 15437 corp: 19/306b lim: 35 exec/s: 33 rss: 73Mb L: 10/34 MS: 1 ShuffleBytes- 00:08:27.500 [2024-11-29 19:21:47.263543] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.263570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.500 [2024-11-29 19:21:47.263646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.263661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.500 #34 NEW cov: 12493 ft: 15452 corp: 20/320b lim: 35 exec/s: 34 rss: 73Mb L: 14/34 MS: 1 ChangeASCIIInt- 00:08:27.500 [2024-11-29 19:21:47.303516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.303543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.500 #35 NEW cov: 12493 ft: 15467 corp: 21/331b lim: 35 exec/s: 35 rss: 73Mb L: 11/34 MS: 1 CrossOver- 00:08:27.500 [2024-11-29 19:21:47.343825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.343851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.500 [2024-11-29 19:21:47.343916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: 
SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.343930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.500 #36 NEW cov: 12493 ft: 15475 corp: 22/348b lim: 35 exec/s: 36 rss: 73Mb L: 17/34 MS: 1 InsertByte- 00:08:27.500 [2024-11-29 19:21:47.404398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.404425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.500 [2024-11-29 19:21:47.404489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.404503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.500 [2024-11-29 19:21:47.404566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.404581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.500 [2024-11-29 19:21:47.404646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.500 [2024-11-29 19:21:47.404660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.761 #42 NEW cov: 12493 ft: 15476 corp: 23/382b lim: 35 exec/s: 42 rss: 73Mb L: 34/34 MS: 1 ChangeBit- 00:08:27.761 [2024-11-29 19:21:47.444439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.444465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.444527] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.444542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.444602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.444617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.444678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.444692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.761 #43 NEW cov: 12493 ft: 15482 corp: 24/411b lim: 35 exec/s: 43 rss: 73Mb L: 29/34 MS: 1 ChangeBinInt- 00:08:27.761 [2024-11-29 19:21:47.484534] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 
19:21:47.484562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.484629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.484644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.484707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.484721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.484780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.484793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.761 #44 NEW cov: 12493 ft: 15510 corp: 25/439b lim: 35 exec/s: 44 rss: 73Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:08:27.761 [2024-11-29 19:21:47.544577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.544609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.544672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.544687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.761 [2024-11-29 19:21:47.544745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.544762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.761 #45 NEW cov: 12493 ft: 15692 corp: 26/463b lim: 35 exec/s: 45 rss: 73Mb L: 24/34 MS: 1 CrossOver- 00:08:27.761 [2024-11-29 19:21:47.584299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.761 [2024-11-29 19:21:47.584326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.761 #46 NEW cov: 12493 ft: 15695 corp: 27/473b lim: 35 exec/s: 46 rss: 73Mb L: 10/34 MS: 1 ShuffleBytes- 00:08:27.761 NEW_FUNC[1/2]: 0x48b638 in feat_error_recover /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:304 00:08:28.021 NEW_FUNC[2/2]: 0x13a84b8 in nvmf_ctrlr_set_features_error_recovery /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1726 00:08:28.021 #51 NEW cov: 12547 ft: 15823 corp: 28/480b lim: 35 exec/s: 51 rss: 74Mb L: 7/34 MS: 5 CrossOver-ChangeBinInt-InsertByte-ChangeBinInt-InsertByte- 00:08:28.021 [2024-11-29 19:21:47.685153] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:28.021 [2024-11-29 19:21:47.685181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.685243] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.685260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.685316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.685330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.685388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.685401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.021 #52 NEW cov: 12554 ft: 15838 corp: 29/508b lim: 35 exec/s: 52 rss: 74Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:28.021 [2024-11-29 19:21:47.745293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.745321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.745384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.745398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.745459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.745473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.745531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.745545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.021 #53 NEW cov: 12554 ft: 15853 corp: 30/536b lim: 35 exec/s: 53 rss: 74Mb L: 28/34 MS: 1 ChangeBit- 00:08:28.021 [2024-11-29 19:21:47.784854] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.784881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.021 #54 NEW cov: 12554 ft: 15872 corp: 31/546b lim: 35 exec/s: 54 rss: 74Mb L: 10/34 MS: 1 CopyPart- 00:08:28.021 [2024-11-29 19:21:47.824998] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000dd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.825024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.021 #55 NEW cov: 12554 ft: 15878 corp: 32/559b lim: 35 exec/s: 55 rss: 74Mb L: 13/34 MS: 1 ChangeByte- 00:08:28.021 [2024-11-29 19:21:47.865127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.865154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.021 #56 NEW cov: 12554 ft: 15884 corp: 33/569b lim: 35 exec/s: 56 rss: 74Mb L: 10/34 MS: 1 ChangeBit- 00:08:28.021 [2024-11-29 19:21:47.925875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.925902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.925966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.925980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.926045] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.926059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.021 [2024-11-29 19:21:47.926120] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000020 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.021 [2024-11-29 19:21:47.926134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.282 #57 NEW cov: 12554 ft: 15902 corp: 34/599b lim: 35 exec/s: 57 rss: 74Mb L: 30/34 MS: 1 CrossOver- 00:08:28.282 [2024-11-29 19:21:47.965408] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.282 [2024-11-29 19:21:47.965435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.282 #58 NEW cov: 12554 ft: 15909 corp: 35/609b lim: 35 exec/s: 58 rss: 74Mb L: 10/34 MS: 1 EraseBytes- 00:08:28.282 [2024-11-29 19:21:48.005529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.282 [2024-11-29 19:21:48.005555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.282 #59 NEW cov: 12554 ft: 15921 corp: 36/621b lim: 35 exec/s: 59 rss: 74Mb L: 12/34 MS: 1 EraseBytes- 00:08:28.282 [2024-11-29 19:21:48.045626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.282 [2024-11-29 19:21:48.045653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.282 #60 NEW cov: 12554 ft: 15956 corp: 37/630b lim: 35 exec/s: 30 rss: 74Mb L: 9/34 MS: 1 EraseBytes- 00:08:28.282 #60 DONE cov: 12554 ft: 15956 corp: 37/630b 
lim: 35 exec/s: 30 rss: 74Mb 00:08:28.282 Done 60 runs in 2 second(s) 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:28.282 19:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:28.542 [2024-11-29 19:21:48.211313] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
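The sed step in the trace above rewrites the listener's trsvcid from 4420 to 4415 in the JSON config passed via -c; once the target loads it, the "Listening on 127.0.0.1 port 4415" notice below confirms it matches the -F transport ID. A plausible shape of the relevant config fragment, as a sketch only: the actual fuzz_json.conf in the SPDK tree is authoritative, and the subsystem/namespace creation entries are omitted here.

    {
      "subsystems": [{
        "subsystem": "nvmf",
        "config": [
          { "method": "nvmf_create_transport",
            "params": { "trtype": "TCP" } },
          { "method": "nvmf_subsystem_add_listener",
            "params": { "nqn": "nqn.2016-06.io.spdk:cnode1",
                        "listen_address": { "trtype": "TCP", "adrfam": "IPv4",
                                            "traddr": "127.0.0.1", "trsvcid": "4415" } } }
        ]
      }]
    }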
00:08:28.542 [2024-11-29 19:21:48.211384] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1752537 ] 00:08:28.542 [2024-11-29 19:21:48.401406] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.542 [2024-11-29 19:21:48.413594] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.802 [2024-11-29 19:21:48.466161] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.802 [2024-11-29 19:21:48.482526] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:28.802 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.802 INFO: Seed: 447481689 00:08:28.802 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:28.802 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:28.802 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:28.802 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.802 #2 INITED exec/s: 0 rss: 64Mb 00:08:28.802 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.802 This may also happen if the target rejected all inputs we tried so far 00:08:28.802 [2024-11-29 19:21:48.531382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.802 [2024-11-29 19:21:48.531411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.802 [2024-11-29 19:21:48.531473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.802 [2024-11-29 19:21:48.531488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.062 NEW_FUNC[1/715]: 0x46ecc8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:29.062 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.062 #5 NEW cov: 12253 ft: 12242 corp: 2/16b lim: 35 exec/s: 0 rss: 72Mb L: 15/15 MS: 3 ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:29.062 [2024-11-29 19:21:48.852160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.852193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.062 [2024-11-29 19:21:48.852256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.852271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.062 NEW_FUNC[1/1]: 0x1a67298 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:405 00:08:29.062 #6 NEW cov: 12367 ft: 12972 corp: 3/31b lim: 35 exec/s: 0 rss: 72Mb 
L: 15/15 MS: 1 ChangeBit- 00:08:29.062 [2024-11-29 19:21:48.912526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.912554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.062 [2024-11-29 19:21:48.912618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.912632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.062 [2024-11-29 19:21:48.912691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.912705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.062 [2024-11-29 19:21:48.912765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.912779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.062 #7 NEW cov: 12373 ft: 13680 corp: 4/63b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:29.062 [2024-11-29 19:21:48.952179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.062 [2024-11-29 19:21:48.952205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.321 #8 NEW cov: 12458 ft: 14213 corp: 5/71b lim: 35 exec/s: 0 rss: 72Mb L: 8/32 MS: 1 EraseBytes- 00:08:29.321 [2024-11-29 19:21:48.992426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.321 [2024-11-29 19:21:48.992452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.321 [2024-11-29 19:21:48.992514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.321 [2024-11-29 19:21:48.992528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.321 #9 NEW cov: 12458 ft: 14303 corp: 6/86b lim: 35 exec/s: 0 rss: 72Mb L: 15/32 MS: 1 ChangeBinInt- 00:08:29.321 NEW_FUNC[1/1]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:29.321 #11 NEW cov: 12472 ft: 14490 corp: 7/95b lim: 35 exec/s: 0 rss: 72Mb L: 9/32 MS: 2 CrossOver-CMP- DE: "\012\000\000\000"- 00:08:29.321 [2024-11-29 19:21:49.092996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.321 [2024-11-29 19:21:49.093022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.321 [2024-11-29 19:21:49.093082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.321 [2024-11-29 19:21:49.093096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.321 [2024-11-29 19:21:49.093151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.321 [2024-11-29 19:21:49.093165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.321 [2024-11-29 19:21:49.093222] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.321 [2024-11-29 19:21:49.093235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.321 #12 NEW cov: 12472 ft: 14592 corp: 8/127b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:08:29.321 #13 NEW cov: 12472 ft: 14616 corp: 9/140b lim: 35 exec/s: 0 rss: 72Mb L: 13/32 MS: 1 PersAutoDict- DE: "\012\000\000\000"- 00:08:29.580 #19 NEW cov: 12472 ft: 14681 corp: 10/149b lim: 35 exec/s: 0 rss: 72Mb L: 9/32 MS: 1 ChangeBinInt- 00:08:29.580 [2024-11-29 19:21:49.253124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000447 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.253154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.580 [2024-11-29 19:21:49.253219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.253234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.580 #20 NEW cov: 12472 ft: 14724 corp: 11/164b lim: 35 exec/s: 0 rss: 73Mb L: 15/32 MS: 1 CMP- DE: "G\221\022sM \224\000"- 00:08:29.580 [2024-11-29 19:21:49.313334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.313360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.580 [2024-11-29 19:21:49.313423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.313440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.580 #21 NEW cov: 12472 ft: 14744 corp: 12/179b lim: 35 exec/s: 0 rss: 73Mb L: 15/32 MS: 1 ChangeByte- 00:08:29.580 [2024-11-29 19:21:49.353301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000447 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.353326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.580 #22 NEW cov: 12472 ft: 14774 corp: 13/187b lim: 35 exec/s: 0 rss: 73Mb L: 8/32 MS: 1 EraseBytes- 00:08:29.580 [2024-11-29 19:21:49.413571] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:29.580 [2024-11-29 19:21:49.413603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.580 [2024-11-29 19:21:49.413682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.413697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.580 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:29.580 #23 NEW cov: 12495 ft: 14845 corp: 14/202b lim: 35 exec/s: 0 rss: 73Mb L: 15/32 MS: 1 ChangeBit- 00:08:29.580 [2024-11-29 19:21:49.453744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.453769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.580 [2024-11-29 19:21:49.453831] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.580 [2024-11-29 19:21:49.453845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.580 #24 NEW cov: 12495 ft: 14852 corp: 15/217b lim: 35 exec/s: 0 rss: 73Mb L: 15/32 MS: 1 CopyPart- 00:08:29.839 [2024-11-29 19:21:49.493721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.493746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.839 #25 NEW cov: 12495 ft: 14870 corp: 16/230b lim: 35 exec/s: 25 rss: 73Mb L: 13/32 MS: 1 EraseBytes- 00:08:29.839 [2024-11-29 19:21:49.533841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000014 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.533866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.839 #26 NEW cov: 12495 ft: 14882 corp: 17/238b lim: 35 exec/s: 26 rss: 73Mb L: 8/32 MS: 1 ChangeBinInt- 00:08:29.839 [2024-11-29 19:21:49.594144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000447 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.594170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.839 [2024-11-29 19:21:49.594233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.594247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.839 #27 NEW cov: 12495 ft: 14913 corp: 18/253b lim: 35 exec/s: 27 rss: 73Mb L: 15/32 MS: 1 CopyPart- 00:08:29.839 [2024-11-29 19:21:49.634235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000726 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.634260] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.839 [2024-11-29 19:21:49.634326] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.634339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.839 #28 NEW cov: 12495 ft: 14922 corp: 19/269b lim: 35 exec/s: 28 rss: 73Mb L: 16/32 MS: 1 InsertByte- 00:08:29.839 [2024-11-29 19:21:49.674224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.674250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.839 #29 NEW cov: 12495 ft: 14932 corp: 20/277b lim: 35 exec/s: 29 rss: 73Mb L: 8/32 MS: 1 ChangeBinInt- 00:08:29.839 [2024-11-29 19:21:49.714614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.714640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.839 [2024-11-29 19:21:49.714707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.714721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.839 [2024-11-29 19:21:49.714780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.839 [2024-11-29 19:21:49.714793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.098 #30 NEW cov: 12495 ft: 15093 corp: 21/298b lim: 35 exec/s: 30 rss: 73Mb L: 21/32 MS: 1 CMP- DE: "\001\000\000\000\000\000\003\377"- 00:08:30.098 [2024-11-29 19:21:49.774538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000014 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.098 [2024-11-29 19:21:49.774564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.098 #31 NEW cov: 12495 ft: 15114 corp: 22/306b lim: 35 exec/s: 31 rss: 73Mb L: 8/32 MS: 1 ChangeBit- 00:08:30.098 #32 NEW cov: 12495 ft: 15154 corp: 23/319b lim: 35 exec/s: 32 rss: 73Mb L: 13/32 MS: 1 ShuffleBytes- 00:08:30.098 #33 NEW cov: 12495 ft: 15163 corp: 24/328b lim: 35 exec/s: 33 rss: 73Mb L: 9/32 MS: 1 ChangeByte- 00:08:30.098 [2024-11-29 19:21:49.935110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.098 [2024-11-29 19:21:49.935135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.098 [2024-11-29 19:21:49.935213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000064d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.098 [2024-11-29 19:21:49.935228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:30.098 #34 NEW cov: 12495 ft: 15212 corp: 25/344b lim: 35 exec/s: 34 rss: 73Mb L: 16/32 MS: 1 CMP- DE: "\000\224 M\331j\357\024"- 00:08:30.098 [2024-11-29 19:21:49.995418] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.098 [2024-11-29 19:21:49.995443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.098 [2024-11-29 19:21:49.995504] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.098 [2024-11-29 19:21:49.995519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.098 [2024-11-29 19:21:49.995579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.098 [2024-11-29 19:21:49.995593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.357 #35 NEW cov: 12495 ft: 15280 corp: 26/365b lim: 35 exec/s: 35 rss: 73Mb L: 21/32 MS: 1 ShuffleBytes- 00:08:30.357 [2024-11-29 19:21:50.055495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000042f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.357 [2024-11-29 19:21:50.055523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.357 [2024-11-29 19:21:50.055588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.055609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.358 #36 NEW cov: 12495 ft: 15298 corp: 27/380b lim: 35 exec/s: 36 rss: 74Mb L: 15/32 MS: 1 ChangeByte- 00:08:30.358 [2024-11-29 19:21:50.115512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.115546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.358 #37 NEW cov: 12495 ft: 15341 corp: 28/388b lim: 35 exec/s: 37 rss: 74Mb L: 8/32 MS: 1 ShuffleBytes- 00:08:30.358 [2024-11-29 19:21:50.155758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000447 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.155785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.358 [2024-11-29 19:21:50.155848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.155863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.358 #38 NEW cov: 12495 ft: 15351 corp: 29/403b lim: 35 exec/s: 38 rss: 74Mb L: 15/32 MS: 1 ChangeBit- 00:08:30.358 [2024-11-29 19:21:50.195847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000726 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:30.358 [2024-11-29 19:21:50.195872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.358 [2024-11-29 19:21:50.195934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.195948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.358 #39 NEW cov: 12495 ft: 15360 corp: 30/423b lim: 35 exec/s: 39 rss: 74Mb L: 20/32 MS: 1 InsertRepeatedBytes- 00:08:30.358 [2024-11-29 19:21:50.256144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.256169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.358 [2024-11-29 19:21:50.256229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.256244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.358 [2024-11-29 19:21:50.256300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.358 [2024-11-29 19:21:50.256313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.616 #40 NEW cov: 12495 ft: 15382 corp: 31/446b lim: 35 exec/s: 40 rss: 74Mb L: 23/32 MS: 1 CMP- DE: "\000\000\000\000\001\000\000\000"- 00:08:30.616 [2024-11-29 19:21:50.296156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000447 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.296182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.616 [2024-11-29 19:21:50.296244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.296259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.616 #41 NEW cov: 12495 ft: 15402 corp: 32/460b lim: 35 exec/s: 41 rss: 74Mb L: 14/32 MS: 1 CopyPart- 00:08:30.616 [2024-11-29 19:21:50.356152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.356177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.616 #42 NEW cov: 12495 ft: 15456 corp: 33/468b lim: 35 exec/s: 42 rss: 74Mb L: 8/32 MS: 1 CopyPart- 00:08:30.616 [2024-11-29 19:21:50.396382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000042f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.396407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.616 [2024-11-29 19:21:50.396467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.396481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.616 #43 NEW cov: 12495 ft: 15488 corp: 34/483b lim: 35 exec/s: 43 rss: 74Mb L: 15/32 MS: 1 PersAutoDict- DE: "\012\000\000\000"- 00:08:30.616 [2024-11-29 19:21:50.456824] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.456849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.616 [2024-11-29 19:21:50.456906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ec SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.456920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.616 [2024-11-29 19:21:50.456976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.456990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.616 [2024-11-29 19:21:50.457046] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.457060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.616 #44 NEW cov: 12495 ft: 15492 corp: 35/515b lim: 35 exec/s: 44 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:30.616 [2024-11-29 19:21:50.496549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000447 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.616 [2024-11-29 19:21:50.496573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.616 #45 NEW cov: 12495 ft: 15510 corp: 36/527b lim: 35 exec/s: 22 rss: 74Mb L: 12/32 MS: 1 EraseBytes- 00:08:30.616 #45 DONE cov: 12495 ft: 15510 corp: 36/527b lim: 35 exec/s: 22 rss: 74Mb 00:08:30.616 ###### Recommended dictionary. ###### 00:08:30.616 "\012\000\000\000" # Uses: 2 00:08:30.616 "G\221\022sM \224\000" # Uses: 0 00:08:30.616 "\001\000\000\000\000\000\003\377" # Uses: 0 00:08:30.616 "\000\224 M\331j\357\024" # Uses: 0 00:08:30.616 "\000\000\000\000\001\000\000\000" # Uses: 0 00:08:30.616 ###### End of recommended dictionary. 
###### 00:08:30.616 Done 45 runs in 2 second(s) 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.875 19:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:30.875 [2024-11-29 19:21:50.661287] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:08:30.875 [2024-11-29 19:21:50.661371] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1752833 ] 00:08:31.135 [2024-11-29 19:21:50.857381] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.135 [2024-11-29 19:21:50.869965] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.135 [2024-11-29 19:21:50.922577] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.135 [2024-11-29 19:21:50.938939] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:31.135 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.135 INFO: Seed: 2905489696 00:08:31.135 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:31.135 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:31.135 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:31.135 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.135 #2 INITED exec/s: 0 rss: 64Mb 00:08:31.135 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.135 This may also happen if the target rejected all inputs we tried so far 00:08:31.135 [2024-11-29 19:21:51.014811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.135 [2024-11-29 19:21:51.014850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.655 NEW_FUNC[1/717]: 0x470188 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:31.655 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.655 #19 NEW cov: 12340 ft: 12313 corp: 2/34b lim: 105 exec/s: 0 rss: 72Mb L: 33/33 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:31.655 [2024-11-29 19:21:51.345743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-29 19:21:51.345784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.655 #20 NEW cov: 12470 ft: 12787 corp: 3/67b lim: 105 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:31.655 [2024-11-29 19:21:51.416202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2531906049332683555 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-29 19:21:51.416238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.655 [2024-11-29 19:21:51.416338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2531906049332683555 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-29 19:21:51.416363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.655 [2024-11-29 
19:21:51.416479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2531906049332683555 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-29 19:21:51.416502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.655 #23 NEW cov: 12476 ft: 13504 corp: 4/139b lim: 105 exec/s: 0 rss: 72Mb L: 72/72 MS: 3 ChangeBit-CrossOver-InsertRepeatedBytes- 00:08:31.655 [2024-11-29 19:21:51.465882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-29 19:21:51.465909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.655 #24 NEW cov: 12561 ft: 13730 corp: 5/180b lim: 105 exec/s: 0 rss: 72Mb L: 41/72 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:31.655 [2024-11-29 19:21:51.516084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.655 [2024-11-29 19:21:51.516111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.655 #25 NEW cov: 12561 ft: 13901 corp: 6/213b lim: 105 exec/s: 0 rss: 72Mb L: 33/72 MS: 1 ChangeBinInt- 00:08:31.915 [2024-11-29 19:21:51.566332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.566359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.915 #26 NEW cov: 12561 ft: 14020 corp: 7/250b lim: 105 exec/s: 0 rss: 72Mb L: 37/72 MS: 1 CrossOver- 00:08:31.915 [2024-11-29 19:21:51.636693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.636725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.915 [2024-11-29 19:21:51.636872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2821266740684990247 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.636899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.915 #32 NEW cov: 12561 ft: 14349 corp: 8/299b lim: 105 exec/s: 0 rss: 72Mb L: 49/72 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:31.915 [2024-11-29 19:21:51.706777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14902075601831012302 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.706810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.915 [2024-11-29 19:21:51.706934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14902075604643794638 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.706953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.915 #35 NEW cov: 12561 ft: 14396 corp: 9/360b lim: 105 exec/s: 0 rss: 72Mb L: 61/72 MS: 3 EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:31.915 [2024-11-29 19:21:51.756710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.756739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.915 #36 NEW cov: 12561 ft: 14410 corp: 10/401b lim: 105 exec/s: 0 rss: 72Mb L: 41/72 MS: 1 ChangeBit- 00:08:31.915 [2024-11-29 19:21:51.806973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.915 [2024-11-29 19:21:51.806999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.174 #37 NEW cov: 12561 ft: 14459 corp: 11/442b lim: 105 exec/s: 0 rss: 72Mb L: 41/72 MS: 1 ShuffleBytes- 00:08:32.174 [2024-11-29 19:21:51.857072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.174 [2024-11-29 19:21:51.857103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.174 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:32.174 #38 NEW cov: 12584 ft: 14589 corp: 12/483b lim: 105 exec/s: 0 rss: 72Mb L: 41/72 MS: 1 ShuffleBytes- 00:08:32.174 [2024-11-29 19:21:51.927617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-29 19:21:51.927649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.175 [2024-11-29 19:21:51.927801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2819577890824726311 len:10038 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-29 19:21:51.927827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.175 #39 NEW cov: 12584 ft: 14703 corp: 13/528b lim: 105 exec/s: 0 rss: 73Mb L: 45/72 MS: 1 CMP- DE: "5\256\012\377N \224\000"- 00:08:32.175 [2024-11-29 19:21:51.987604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-29 19:21:51.987630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.175 #40 NEW cov: 12584 ft: 14800 corp: 14/561b lim: 105 exec/s: 40 rss: 73Mb L: 33/72 MS: 1 CMP- DE: "\376\377\377\377"- 00:08:32.175 [2024-11-29 19:21:52.037604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.175 [2024-11-29 19:21:52.037636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:32.175 #41 NEW cov: 12584 ft: 14823 corp: 15/602b lim: 105 exec/s: 41 rss: 73Mb L: 41/72 MS: 1 ChangeBinInt- 00:08:32.434 [2024-11-29 19:21:52.087982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.088016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.434 [2024-11-29 19:21:52.088148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2819577890824726311 len:10038 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.088167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.434 #42 NEW cov: 12584 ft: 14840 corp: 16/647b lim: 105 exec/s: 42 rss: 73Mb L: 45/72 MS: 1 CopyPart- 00:08:32.434 [2024-11-29 19:21:52.158013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.158039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.434 #43 NEW cov: 12584 ft: 14878 corp: 17/686b lim: 105 exec/s: 43 rss: 73Mb L: 39/72 MS: 1 CrossOver- 00:08:32.434 [2024-11-29 19:21:52.208203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198455847 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.208232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.434 #44 NEW cov: 12584 ft: 14885 corp: 18/719b lim: 105 exec/s: 44 rss: 73Mb L: 33/72 MS: 1 ChangeByte- 00:08:32.434 [2024-11-29 19:21:52.258932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.258965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.434 [2024-11-29 19:21:52.259024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13165911456529954486 len:46775 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.259050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.434 [2024-11-29 19:21:52.259176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:13165911456529954486 len:46775 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.259199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.434 [2024-11-29 19:21:52.259327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:13165911456529954486 len:46775 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.259349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.434 #45 NEW cov: 12584 ft: 15418 corp: 19/823b lim: 105 exec/s: 45 rss: 73Mb L: 
104/104 MS: 1 InsertRepeatedBytes- 00:08:32.434 [2024-11-29 19:21:52.328581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.434 [2024-11-29 19:21:52.328618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.693 #48 NEW cov: 12584 ft: 15471 corp: 20/860b lim: 105 exec/s: 48 rss: 73Mb L: 37/104 MS: 3 ChangeByte-ChangeBit-CrossOver- 00:08:32.693 [2024-11-29 19:21:52.379259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2531906049332683555 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.693 [2024-11-29 19:21:52.379295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.693 [2024-11-29 19:21:52.379393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14685055086129564619 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.693 [2024-11-29 19:21:52.379417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.693 [2024-11-29 19:21:52.379532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2531906049332683555 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.693 [2024-11-29 19:21:52.379555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.693 [2024-11-29 19:21:52.379673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2531906049332683555 len:8996 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.693 [2024-11-29 19:21:52.379702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.693 #49 NEW cov: 12584 ft: 15485 corp: 21/944b lim: 105 exec/s: 49 rss: 73Mb L: 84/104 MS: 1 InsertRepeatedBytes- 00:08:32.693 [2024-11-29 19:21:52.449121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.693 [2024-11-29 19:21:52.449157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.693 [2024-11-29 19:21:52.449281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.694 [2024-11-29 19:21:52.449307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.694 #50 NEW cov: 12584 ft: 15501 corp: 22/996b lim: 105 exec/s: 50 rss: 73Mb L: 52/104 MS: 1 InsertRepeatedBytes- 00:08:32.694 [2024-11-29 19:21:52.519124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.694 [2024-11-29 19:21:52.519171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.694 #51 NEW cov: 12584 ft: 15556 corp: 23/1037b lim: 105 exec/s: 51 rss: 73Mb L: 41/104 MS: 1 ChangeByte- 00:08:32.694 [2024-11-29 
19:21:52.569640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.694 [2024-11-29 19:21:52.569676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.694 [2024-11-29 19:21:52.569783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2821266740684990247 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.694 [2024-11-29 19:21:52.569810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.694 #52 NEW cov: 12584 ft: 15568 corp: 24/1089b lim: 105 exec/s: 52 rss: 73Mb L: 52/104 MS: 1 CopyPart- 00:08:32.954 [2024-11-29 19:21:52.619653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.954 [2024-11-29 19:21:52.619688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.954 [2024-11-29 19:21:52.619819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2819577890824726309 len:10038 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.954 [2024-11-29 19:21:52.619844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.954 #53 NEW cov: 12584 ft: 15572 corp: 25/1134b lim: 105 exec/s: 53 rss: 73Mb L: 45/104 MS: 1 ChangeByte- 00:08:32.954 [2024-11-29 19:21:52.689606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.954 [2024-11-29 19:21:52.689639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.954 #54 NEW cov: 12584 ft: 15581 corp: 26/1175b lim: 105 exec/s: 54 rss: 73Mb L: 41/104 MS: 1 ChangeByte- 00:08:32.954 [2024-11-29 19:21:52.759795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.954 [2024-11-29 19:21:52.759826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.954 #55 NEW cov: 12584 ft: 15620 corp: 27/1216b lim: 105 exec/s: 55 rss: 73Mb L: 41/104 MS: 1 ChangeByte- 00:08:32.954 [2024-11-29 19:21:52.809996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.954 [2024-11-29 19:21:52.810023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.954 #56 NEW cov: 12584 ft: 15625 corp: 28/1249b lim: 105 exec/s: 56 rss: 73Mb L: 33/104 MS: 1 ShuffleBytes- 00:08:32.954 [2024-11-29 19:21:52.860246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16660348479232 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.954 [2024-11-29 19:21:52.860272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.214 #57 
NEW cov: 12584 ft: 15639 corp: 29/1290b lim: 105 exec/s: 57 rss: 73Mb L: 41/104 MS: 1 CMP- DE: "\000\000\000\017"- 00:08:33.214 [2024-11-29 19:21:52.910461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14902075601831012302 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.214 [2024-11-29 19:21:52.910486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.214 [2024-11-29 19:21:52.910618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14902075604643794638 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.214 [2024-11-29 19:21:52.910641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.214 #58 NEW cov: 12584 ft: 15644 corp: 30/1351b lim: 105 exec/s: 58 rss: 73Mb L: 61/104 MS: 1 ChangeBinInt- 00:08:33.214 [2024-11-29 19:21:52.980745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740198450983 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.214 [2024-11-29 19:21:52.980778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.214 [2024-11-29 19:21:52.980893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2821266740686694183 len:577 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.214 [2024-11-29 19:21:52.980919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.214 #59 NEW cov: 12584 ft: 15656 corp: 31/1393b lim: 105 exec/s: 29 rss: 73Mb L: 42/104 MS: 1 InsertByte- 00:08:33.214 #59 DONE cov: 12584 ft: 15656 corp: 31/1393b lim: 105 exec/s: 29 rss: 73Mb 00:08:33.214 ###### Recommended dictionary. ###### 00:08:33.214 "\002\000\000\000\000\000\000\000" # Uses: 2 00:08:33.214 "5\256\012\377N \224\000" # Uses: 0 00:08:33.214 "\376\377\377\377" # Uses: 0 00:08:33.214 "\000\000\000\017" # Uses: 0 00:08:33.214 ###### End of recommended dictionary. 
###### 00:08:33.214 Done 59 runs in 2 second(s) 00:08:33.214 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:33.473 19:21:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:33.473 [2024-11-29 19:21:53.169556] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:08:33.473 [2024-11-29 19:21:53.169651] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1753362 ] 00:08:33.473 [2024-11-29 19:21:53.352419] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.473 [2024-11-29 19:21:53.364754] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.732 [2024-11-29 19:21:53.417854] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.732 [2024-11-29 19:21:53.434190] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:33.732 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.732 INFO: Seed: 1106513017 00:08:33.732 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:33.732 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:33.732 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:33.732 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.732 #2 INITED exec/s: 0 rss: 64Mb 00:08:33.732 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.732 This may also happen if the target rejected all inputs we tried so far 00:08:33.732 [2024-11-29 19:21:53.479720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.732 [2024-11-29 19:21:53.479750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.732 [2024-11-29 19:21:53.479787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.732 [2024-11-29 19:21:53.479803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.732 [2024-11-29 19:21:53.479858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.732 [2024-11-29 19:21:53.479872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.991 NEW_FUNC[1/718]: 0x473508 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:33.991 NEW_FUNC[2/718]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.991 #3 NEW cov: 12379 ft: 12372 corp: 2/88b lim: 120 exec/s: 0 rss: 72Mb L: 87/87 MS: 1 InsertRepeatedBytes- 00:08:33.991 [2024-11-29 19:21:53.810539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.991 [2024-11-29 19:21:53.810573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.991 [2024-11-29 19:21:53.810610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.991 [2024-11-29 19:21:53.810645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.991 [2024-11-29 19:21:53.810699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.991 [2024-11-29 19:21:53.810714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.991 #4 NEW cov: 12492 ft: 13023 corp: 3/176b lim: 120 exec/s: 0 rss: 72Mb L: 88/88 MS: 1 InsertByte- 00:08:33.991 [2024-11-29 19:21:53.870663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.991 [2024-11-29 19:21:53.870691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.991 [2024-11-29 19:21:53.870730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.991 [2024-11-29 19:21:53.870744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.991 [2024-11-29 19:21:53.870798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.991 [2024-11-29 19:21:53.870813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.249 #5 NEW cov: 12498 ft: 13201 corp: 4/264b lim: 120 exec/s: 0 rss: 72Mb L: 88/88 MS: 1 ChangeBit- 00:08:34.249 [2024-11-29 19:21:53.930798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:53.930825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:53.930863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:53.930879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:53.930933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:53.930949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.249 #6 NEW cov: 12583 ft: 13593 corp: 5/352b lim: 120 exec/s: 0 rss: 72Mb L: 88/88 MS: 1 CrossOver- 00:08:34.249 [2024-11-29 19:21:53.970914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:53.970942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 
19:21:53.970979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:53.970996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:53.971049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:53.971063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.249 #7 NEW cov: 12583 ft: 13728 corp: 6/441b lim: 120 exec/s: 0 rss: 72Mb L: 89/89 MS: 1 CrossOver- 00:08:34.249 [2024-11-29 19:21:54.011012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:54.011038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:54.011075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:54.011092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:54.011142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:54.011157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.249 #8 NEW cov: 12583 ft: 13781 corp: 7/530b lim: 120 exec/s: 0 rss: 72Mb L: 89/89 MS: 1 CrossOver- 00:08:34.249 [2024-11-29 19:21:54.051098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:54.051125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:54.051167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:54.051183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.249 [2024-11-29 19:21:54.051236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.249 [2024-11-29 19:21:54.051252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.250 #9 NEW cov: 12583 ft: 13841 corp: 8/619b lim: 120 exec/s: 0 rss: 72Mb L: 89/89 MS: 1 ChangeBinInt- 00:08:34.250 [2024-11-29 19:21:54.111308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.250 [2024-11-29 19:21:54.111336] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.250 [2024-11-29 19:21:54.111371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.250 [2024-11-29 19:21:54.111387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.250 [2024-11-29 19:21:54.111439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.250 [2024-11-29 19:21:54.111455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.250 #10 NEW cov: 12583 ft: 13875 corp: 9/707b lim: 120 exec/s: 0 rss: 72Mb L: 88/89 MS: 1 ShuffleBytes- 00:08:34.507 [2024-11-29 19:21:54.171351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.507 [2024-11-29 19:21:54.171379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.507 [2024-11-29 19:21:54.171416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143582354969877 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.507 [2024-11-29 19:21:54.171433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.507 #11 NEW cov: 12583 ft: 14253 corp: 10/762b lim: 120 exec/s: 0 rss: 73Mb L: 55/89 MS: 1 EraseBytes- 00:08:34.508 [2024-11-29 19:21:54.211424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.211452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.211497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143582354969877 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.211514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.508 #12 NEW cov: 12583 ft: 14304 corp: 11/817b lim: 120 exec/s: 0 rss: 73Mb L: 55/89 MS: 1 ChangeBit- 00:08:34.508 [2024-11-29 19:21:54.271749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.271776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.271823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.271839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.271891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 
nsid:0 lba:1516047404855792917 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.271906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.508 #13 NEW cov: 12583 ft: 14326 corp: 12/907b lim: 120 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 CrossOver- 00:08:34.508 [2024-11-29 19:21:54.331938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.331965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.332004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.332020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.332075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.332092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.508 #14 NEW cov: 12583 ft: 14364 corp: 13/995b lim: 120 exec/s: 0 rss: 73Mb L: 88/90 MS: 1 ChangeBinInt- 00:08:34.508 [2024-11-29 19:21:54.371989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.372017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.372056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.372072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.508 [2024-11-29 19:21:54.372124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1516047404855792917 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.508 [2024-11-29 19:21:54.372140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.767 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:34.767 #15 NEW cov: 12606 ft: 14430 corp: 14/1085b lim: 120 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 ChangeBit- 00:08:34.767 [2024-11-29 19:21:54.432169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.432196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.432234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 
19:21:54.432250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.432303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.432320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.767 #16 NEW cov: 12606 ft: 14440 corp: 15/1173b lim: 120 exec/s: 0 rss: 73Mb L: 88/90 MS: 1 InsertByte- 00:08:34.767 [2024-11-29 19:21:54.472252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.472280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.472318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.472333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.472388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.472404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.767 #17 NEW cov: 12606 ft: 14467 corp: 16/1263b lim: 120 exec/s: 17 rss: 73Mb L: 90/90 MS: 1 InsertByte- 00:08:34.767 [2024-11-29 19:21:54.532604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.532632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.532684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.532700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.532753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1521395429413295381 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.532770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.532822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.532839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.767 #18 NEW cov: 12606 ft: 14885 corp: 17/1368b lim: 120 exec/s: 18 rss: 73Mb L: 105/105 MS: 1 CrossOver- 00:08:34.767 [2024-11-29 19:21:54.592628] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.592656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.592701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.592717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.592767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.592782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.767 #19 NEW cov: 12606 ft: 14957 corp: 18/1457b lim: 120 exec/s: 19 rss: 73Mb L: 89/105 MS: 1 InsertByte- 00:08:34.767 [2024-11-29 19:21:54.652669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.652697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.767 [2024-11-29 19:21:54.652745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.767 [2024-11-29 19:21:54.652762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.026 #20 NEW cov: 12606 ft: 14977 corp: 19/1509b lim: 120 exec/s: 20 rss: 73Mb L: 52/105 MS: 1 EraseBytes- 00:08:35.026 [2024-11-29 19:21:54.712915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.712941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.712976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.712991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.713045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.713061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.026 #21 NEW cov: 12606 ft: 15004 corp: 20/1598b lim: 120 exec/s: 21 rss: 73Mb L: 89/105 MS: 1 ChangeByte- 00:08:35.026 [2024-11-29 19:21:54.753048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.753076] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.753115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.753132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.753185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.753200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.026 #22 NEW cov: 12606 ft: 15022 corp: 21/1688b lim: 120 exec/s: 22 rss: 73Mb L: 90/105 MS: 1 InsertByte- 00:08:35.026 [2024-11-29 19:21:54.813199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.813226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.813271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.813287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.813342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1516047404855792917 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.813358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.026 #23 NEW cov: 12606 ft: 15054 corp: 22/1778b lim: 120 exec/s: 23 rss: 74Mb L: 90/105 MS: 1 ChangeBinInt- 00:08:35.026 [2024-11-29 19:21:54.853177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.853203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.853236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143582354969877 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.853254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.026 #24 NEW cov: 12606 ft: 15091 corp: 23/1829b lim: 120 exec/s: 24 rss: 74Mb L: 51/105 MS: 1 EraseBytes- 00:08:35.026 [2024-11-29 19:21:54.893564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.893591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.893649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.893662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.893714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1516047404855792917 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.893728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.026 [2024-11-29 19:21:54.893783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1519372143334593813 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.026 [2024-11-29 19:21:54.893799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.286 #25 NEW cov: 12606 ft: 15103 corp: 24/1928b lim: 120 exec/s: 25 rss: 74Mb L: 99/105 MS: 1 InsertRepeatedBytes- 00:08:35.286 [2024-11-29 19:21:54.953738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:54.953766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:54.953817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:54.953835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:54.953889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9331882294292845953 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:54.953905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:54.953958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9331881832255422849 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:54.953972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.286 #26 NEW cov: 12606 ft: 15123 corp: 25/2046b lim: 120 exec/s: 26 rss: 74Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:35.286 [2024-11-29 19:21:54.993561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:54.993588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:54.993630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:54.993646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.286 #27 NEW cov: 12606 ft: 15142 corp: 26/2098b lim: 120 exec/s: 27 rss: 
74Mb L: 52/118 MS: 1 CrossOver- 00:08:35.286 [2024-11-29 19:21:55.053769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.053796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:55.053848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.053863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.286 #28 NEW cov: 12606 ft: 15173 corp: 27/2167b lim: 120 exec/s: 28 rss: 74Mb L: 69/118 MS: 1 InsertRepeatedBytes- 00:08:35.286 [2024-11-29 19:21:55.093850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.093877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:55.093918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.093933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.286 #29 NEW cov: 12606 ft: 15187 corp: 28/2219b lim: 120 exec/s: 29 rss: 74Mb L: 52/118 MS: 1 ChangeBinInt- 00:08:35.286 [2024-11-29 19:21:55.154328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.154356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:55.154408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.154435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:55.154489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519152425692632341 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.154508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.286 [2024-11-29 19:21:55.154562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1519143629599610133 len:2582 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.286 [2024-11-29 19:21:55.154579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.544 #30 NEW cov: 12606 ft: 15191 corp: 29/2325b lim: 120 exec/s: 30 rss: 74Mb L: 106/118 MS: 1 InsertByte- 00:08:35.544 [2024-11-29 19:21:55.214487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 
19:21:55.214514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.214563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.214579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.214631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519143629599610133 len:5632 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.214648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.214700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.214715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.544 #31 NEW cov: 12606 ft: 15227 corp: 30/2435b lim: 120 exec/s: 31 rss: 74Mb L: 110/118 MS: 1 InsertRepeatedBytes- 00:08:35.544 [2024-11-29 19:21:55.274506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.274535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.274573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.274589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.274647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:726510095182402837 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.274664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.544 #32 NEW cov: 12606 ft: 15245 corp: 31/2509b lim: 120 exec/s: 32 rss: 74Mb L: 74/118 MS: 1 EraseBytes- 00:08:35.544 [2024-11-29 19:21:55.334887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.334916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.334961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.334977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.335030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1519152425692632341 len:5398 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:35.544 [2024-11-29 19:21:55.335050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.335102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1519143629599610133 len:2582 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.335116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.544 #38 NEW cov: 12606 ft: 15254 corp: 32/2615b lim: 120 exec/s: 38 rss: 74Mb L: 106/118 MS: 1 CopyPart- 00:08:35.544 [2024-11-29 19:21:55.394884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222133582697749 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.394912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.394947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.394964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.395017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1516047404855792917 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.395034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.544 #39 NEW cov: 12606 ft: 15257 corp: 33/2705b lim: 120 exec/s: 39 rss: 74Mb L: 90/118 MS: 1 ChangeBit- 00:08:35.544 [2024-11-29 19:21:55.435004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:366222124992763157 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.435033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.435071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1519143629599610133 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.435089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.544 [2024-11-29 19:21:55.435142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1516047404855792917 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.544 [2024-11-29 19:21:55.435157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.802 #40 NEW cov: 12606 ft: 15313 corp: 34/2795b lim: 120 exec/s: 40 rss: 74Mb L: 90/118 MS: 1 ShuffleBytes- 00:08:35.802 [2024-11-29 19:21:55.474922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1519143629415060757 len:5398 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.802 [2024-11-29 19:21:55.474951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.802 [2024-11-29 19:21:55.474993] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16927600440522435562 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.802 [2024-11-29 19:21:55.475008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.802 #41 NEW cov: 12606 ft: 15341 corp: 35/2847b lim: 120 exec/s: 20 rss: 74Mb L: 52/118 MS: 1 ChangeBinInt- 00:08:35.802 #41 DONE cov: 12606 ft: 15341 corp: 35/2847b lim: 120 exec/s: 20 rss: 74Mb 00:08:35.802 Done 41 runs in 2 second(s) 00:08:35.802 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:35.803 19:21:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:35.803 [2024-11-29 19:21:55.669194] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:08:35.803 [2024-11-29 19:21:55.669270] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1753788 ] 00:08:36.061 [2024-11-29 19:21:55.861666] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.061 [2024-11-29 19:21:55.874151] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.061 [2024-11-29 19:21:55.926847] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:36.061 [2024-11-29 19:21:55.943293] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:36.061 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.061 INFO: Seed: 3615516146 00:08:36.320 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:36.320 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:36.320 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:36.320 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.320 #2 INITED exec/s: 0 rss: 64Mb 00:08:36.320 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.320 This may also happen if the target rejected all inputs we tried so far 00:08:36.320 [2024-11-29 19:21:55.987884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.320 [2024-11-29 19:21:55.987917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.579 NEW_FUNC[1/710]: 0x476df8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:36.579 NEW_FUNC[2/710]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.579 #7 NEW cov: 12271 ft: 12314 corp: 2/25b lim: 100 exec/s: 0 rss: 72Mb L: 24/24 MS: 5 ShuffleBytes-CopyPart-CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:08:36.579 [2024-11-29 19:21:56.338805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.579 [2024-11-29 19:21:56.338842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.579 NEW_FUNC[1/6]: 0x1011a08 in rte_get_tsc_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/rte_cycles.h:61 00:08:36.579 NEW_FUNC[2/6]: 0x1011a78 in rte_rdtsc /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/rte_cycles.h:31 00:08:36.579 #8 NEW cov: 12435 ft: 12918 corp: 3/49b lim: 100 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 CrossOver- 00:08:36.579 [2024-11-29 19:21:56.428894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.579 [2024-11-29 19:21:56.428922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.579 #9 NEW cov: 12441 ft: 13212 corp: 4/74b lim: 100 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 InsertByte- 00:08:36.579 [2024-11-29 19:21:56.479020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.579 [2024-11-29 
19:21:56.479049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.838 #10 NEW cov: 12526 ft: 13508 corp: 5/98b lim: 100 exec/s: 0 rss: 72Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:36.838 [2024-11-29 19:21:56.539166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.838 [2024-11-29 19:21:56.539196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.838 #11 NEW cov: 12526 ft: 13566 corp: 6/123b lim: 100 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:36.838 [2024-11-29 19:21:56.629446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.839 [2024-11-29 19:21:56.629476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.839 #12 NEW cov: 12526 ft: 13682 corp: 7/148b lim: 100 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 InsertByte- 00:08:36.839 [2024-11-29 19:21:56.719682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.839 [2024-11-29 19:21:56.719712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.097 #13 NEW cov: 12526 ft: 13750 corp: 8/173b lim: 100 exec/s: 0 rss: 73Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:37.097 [2024-11-29 19:21:56.809951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.097 [2024-11-29 19:21:56.809981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.097 [2024-11-29 19:21:56.810014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.097 [2024-11-29 19:21:56.810030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.097 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:37.097 #18 NEW cov: 12543 ft: 14138 corp: 9/224b lim: 100 exec/s: 0 rss: 73Mb L: 51/51 MS: 5 EraseBytes-ChangeBit-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:37.097 [2024-11-29 19:21:56.900153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.097 [2024-11-29 19:21:56.900183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.097 #19 NEW cov: 12543 ft: 14221 corp: 10/248b lim: 100 exec/s: 0 rss: 73Mb L: 24/51 MS: 1 ChangeByte- 00:08:37.097 [2024-11-29 19:21:56.950251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.097 [2024-11-29 19:21:56.950286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.097 #20 NEW cov: 12543 ft: 14252 corp: 11/274b lim: 100 exec/s: 20 rss: 73Mb L: 26/51 MS: 1 InsertByte- 00:08:37.356 [2024-11-29 19:21:57.010397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.356 [2024-11-29 19:21:57.010427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.356 #21 NEW cov: 12543 ft: 14398 corp: 12/298b lim: 100 exec/s: 21 rss: 73Mb L: 24/51 MS: 1 CrossOver- 00:08:37.356 [2024-11-29 19:21:57.100629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.356 [2024-11-29 19:21:57.100658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.356 #22 NEW cov: 12543 ft: 14438 corp: 13/323b lim: 100 exec/s: 22 rss: 73Mb L: 25/51 MS: 1 ChangeBinInt- 00:08:37.356 [2024-11-29 19:21:57.190907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.356 [2024-11-29 19:21:57.190937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.615 #23 NEW cov: 12543 ft: 14483 corp: 14/351b lim: 100 exec/s: 23 rss: 73Mb L: 28/51 MS: 1 CopyPart- 00:08:37.615 [2024-11-29 19:21:57.281109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.615 [2024-11-29 19:21:57.281138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.615 #24 NEW cov: 12543 ft: 14512 corp: 15/376b lim: 100 exec/s: 24 rss: 73Mb L: 25/51 MS: 1 ChangeByte- 00:08:37.615 [2024-11-29 19:21:57.331250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.615 [2024-11-29 19:21:57.331280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.615 #25 NEW cov: 12543 ft: 14576 corp: 16/400b lim: 100 exec/s: 25 rss: 73Mb L: 24/51 MS: 1 ShuffleBytes- 00:08:37.615 [2024-11-29 19:21:57.391353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.615 [2024-11-29 19:21:57.391381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.615 #26 NEW cov: 12543 ft: 14611 corp: 17/433b lim: 100 exec/s: 26 rss: 73Mb L: 33/51 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:37.615 [2024-11-29 19:21:57.481657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.615 [2024-11-29 19:21:57.481686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.615 #27 NEW cov: 12543 ft: 14647 corp: 18/458b lim: 100 exec/s: 27 rss: 73Mb L: 25/51 MS: 1 InsertByte- 00:08:37.874 [2024-11-29 19:21:57.531812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.874 [2024-11-29 19:21:57.531841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.874 [2024-11-29 19:21:57.531889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.874 [2024-11-29 19:21:57.531912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.874 #28 NEW cov: 12543 ft: 14683 corp: 19/509b lim: 100 exec/s: 28 rss: 73Mb L: 51/51 MS: 1 ChangeBit- 00:08:37.874 [2024-11-29 
19:21:57.622023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.874 [2024-11-29 19:21:57.622051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.874 #29 NEW cov: 12543 ft: 14702 corp: 20/533b lim: 100 exec/s: 29 rss: 73Mb L: 24/51 MS: 1 ShuffleBytes- 00:08:37.874 [2024-11-29 19:21:57.712249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.874 [2024-11-29 19:21:57.712277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.874 #30 NEW cov: 12543 ft: 14708 corp: 21/566b lim: 100 exec/s: 30 rss: 73Mb L: 33/51 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:37.874 [2024-11-29 19:21:57.762357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.874 [2024-11-29 19:21:57.762385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.874 [2024-11-29 19:21:57.762433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.874 [2024-11-29 19:21:57.762456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.132 #31 NEW cov: 12543 ft: 14732 corp: 22/620b lim: 100 exec/s: 31 rss: 73Mb L: 54/54 MS: 1 InsertRepeatedBytes- 00:08:38.132 [2024-11-29 19:21:57.822605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.132 [2024-11-29 19:21:57.822634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.132 [2024-11-29 19:21:57.822680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.132 [2024-11-29 19:21:57.822705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.132 [2024-11-29 19:21:57.822734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.132 [2024-11-29 19:21:57.822749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.132 [2024-11-29 19:21:57.822777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.132 [2024-11-29 19:21:57.822791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.132 #32 NEW cov: 12543 ft: 15089 corp: 23/710b lim: 100 exec/s: 32 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:38.132 [2024-11-29 19:21:57.882715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.132 [2024-11-29 19:21:57.882744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.132 [2024-11-29 19:21:57.882777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.132 [2024-11-29 19:21:57.882792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.132 #33 NEW cov: 12550 ft: 15147 corp: 24/764b lim: 100 exec/s: 33 rss: 73Mb L: 54/90 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:38.132 [2024-11-29 19:21:57.972966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.132 [2024-11-29 19:21:57.972994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.132 [2024-11-29 19:21:57.973026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.132 [2024-11-29 19:21:57.973042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.391 #34 NEW cov: 12550 ft: 15227 corp: 25/816b lim: 100 exec/s: 17 rss: 74Mb L: 52/90 MS: 1 InsertByte- 00:08:38.391 #34 DONE cov: 12550 ft: 15227 corp: 25/816b lim: 100 exec/s: 17 rss: 74Mb 00:08:38.391 ###### Recommended dictionary. ###### 00:08:38.391 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:38.391 ###### End of recommended dictionary. ###### 00:08:38.391 Done 34 runs in 2 second(s) 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:38.391 19:21:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 
-P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:38.391 [2024-11-29 19:21:58.189473] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:38.392 [2024-11-29 19:21:58.189548] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1754186 ] 00:08:38.650 [2024-11-29 19:21:58.390943] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.650 [2024-11-29 19:21:58.403296] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.650 [2024-11-29 19:21:58.455899] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.650 [2024-11-29 19:21:58.472260] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:38.650 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.650 INFO: Seed: 1848598672 00:08:38.650 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:38.650 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:38.650 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:38.650 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.650 #2 INITED exec/s: 0 rss: 64Mb 00:08:38.650 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.650 This may also happen if the target rejected all inputs we tried so far 00:08:38.650 [2024-11-29 19:21:58.517539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:38.650 [2024-11-29 19:21:58.517571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.650 [2024-11-29 19:21:58.517630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808255 len:1 00:08:38.650 [2024-11-29 19:21:58.517647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.167 NEW_FUNC[1/716]: 0x479db8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:39.168 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:39.168 #6 NEW cov: 12300 ft: 12299 corp: 2/21b lim: 50 exec/s: 0 rss: 71Mb L: 20/20 MS: 4 ShuffleBytes-ChangeBit-CMP-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:08:39.168 [2024-11-29 19:21:58.848329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:39.168 [2024-11-29 19:21:58.848364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.168 [2024-11-29 19:21:58.848420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:18446462603027808035 len:1 00:08:39.168 [2024-11-29 19:21:58.848437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.168 #12 NEW cov: 12413 ft: 12803 corp: 3/41b lim: 50 exec/s: 0 rss: 71Mb L: 20/20 MS: 1 ChangeByte- 00:08:39.168 [2024-11-29 19:21:58.908398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:39.168 [2024-11-29 19:21:58.908426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.168 [2024-11-29 19:21:58.908461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551395 len:65536 00:08:39.168 [2024-11-29 19:21:58.908476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.168 #18 NEW cov: 12419 ft: 13009 corp: 4/61b lim: 50 exec/s: 0 rss: 71Mb L: 20/20 MS: 1 CrossOver- 00:08:39.168 [2024-11-29 19:21:58.968559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.168 [2024-11-29 19:21:58.968587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.168 [2024-11-29 19:21:58.968641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808035 len:1 00:08:39.168 [2024-11-29 19:21:58.968658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.168 #24 NEW cov: 12504 ft: 13395 corp: 5/81b lim: 50 exec/s: 0 rss: 71Mb L: 20/20 MS: 1 ChangeByte- 00:08:39.168 [2024-11-29 19:21:59.008540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.168 [2024-11-29 19:21:59.008568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.168 #30 NEW cov: 12504 ft: 13770 corp: 6/100b lim: 50 exec/s: 0 rss: 71Mb L: 19/20 MS: 1 EraseBytes- 00:08:39.168 [2024-11-29 19:21:59.068814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.168 [2024-11-29 19:21:59.068842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.168 [2024-11-29 19:21:59.068893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2594072288873676799 len:1 00:08:39.168 [2024-11-29 19:21:59.068910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.426 #31 NEW cov: 12504 ft: 13928 corp: 7/121b lim: 50 exec/s: 0 rss: 71Mb L: 21/21 MS: 1 InsertByte- 00:08:39.426 [2024-11-29 19:21:59.108811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.426 [2024-11-29 19:21:59.108838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:39.426 #32 NEW cov: 12504 ft: 13975 corp: 8/140b lim: 50 exec/s: 0 rss: 72Mb L: 19/21 MS: 1 CopyPart- 00:08:39.426 [2024-11-29 19:21:59.169099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072770551807 len:65536 00:08:39.426 [2024-11-29 19:21:59.169127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.426 [2024-11-29 19:21:59.169176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808035 len:1 00:08:39.426 [2024-11-29 19:21:59.169192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.426 #33 NEW cov: 12504 ft: 14024 corp: 9/160b lim: 50 exec/s: 0 rss: 72Mb L: 20/21 MS: 1 InsertByte- 00:08:39.426 [2024-11-29 19:21:59.209429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8029759186529156975 len:28528 00:08:39.426 [2024-11-29 19:21:59.209456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.426 [2024-11-29 19:21:59.209502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8029759185026510703 len:28528 00:08:39.426 [2024-11-29 19:21:59.209518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.426 [2024-11-29 19:21:59.209569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8029759803501801327 len:65536 00:08:39.426 [2024-11-29 19:21:59.209584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.426 [2024-11-29 19:21:59.209657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12970366926827028479 len:9216 00:08:39.426 [2024-11-29 19:21:59.209684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.426 #34 NEW cov: 12504 ft: 14410 corp: 10/205b lim: 50 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:39.426 [2024-11-29 19:21:59.269414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1095468318464 len:65536 00:08:39.426 [2024-11-29 19:21:59.269442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.427 [2024-11-29 19:21:59.269484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808035 len:1 00:08:39.427 [2024-11-29 19:21:59.269499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.427 #35 NEW cov: 12504 ft: 14476 corp: 11/225b lim: 50 exec/s: 0 rss: 72Mb L: 20/45 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:39.427 [2024-11-29 19:21:59.309379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1095468318464 len:65536 00:08:39.427 [2024-11-29 19:21:59.309406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.685 #36 NEW cov: 12504 ft: 14522 corp: 12/242b lim: 50 exec/s: 0 rss: 72Mb L: 17/45 MS: 1 EraseBytes- 00:08:39.685 [2024-11-29 19:21:59.369654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.685 [2024-11-29 19:21:59.369681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.685 [2024-11-29 19:21:59.369733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12970366922532061184 len:9216 00:08:39.685 [2024-11-29 19:21:59.369750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.685 #37 NEW cov: 12504 ft: 14566 corp: 13/267b lim: 50 exec/s: 0 rss: 72Mb L: 25/45 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:39.685 [2024-11-29 19:21:59.409651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446742977570144255 len:1 00:08:39.685 [2024-11-29 19:21:59.409678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.685 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:39.685 #38 NEW cov: 12527 ft: 14644 corp: 14/286b lim: 50 exec/s: 0 rss: 72Mb L: 19/45 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:39.685 [2024-11-29 19:21:59.449901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:39.685 [2024-11-29 19:21:59.449929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.685 [2024-11-29 19:21:59.449981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551395 len:65536 00:08:39.685 [2024-11-29 19:21:59.449999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.685 #39 NEW cov: 12527 ft: 14655 corp: 15/312b lim: 50 exec/s: 0 rss: 72Mb L: 26/45 MS: 1 CrossOver- 00:08:39.685 [2024-11-29 19:21:59.490006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446585121516552191 len:28528 00:08:39.685 [2024-11-29 19:21:59.490034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.685 [2024-11-29 19:21:59.490069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8029759803501801327 len:65536 00:08:39.685 [2024-11-29 19:21:59.490085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.686 #40 NEW cov: 12527 ft: 14678 corp: 16/332b lim: 50 exec/s: 40 rss: 72Mb L: 20/45 MS: 1 CrossOver- 00:08:39.686 [2024-11-29 19:21:59.550050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3372154880 len:3 00:08:39.686 [2024-11-29 19:21:59.550078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.686 #41 NEW cov: 12527 ft: 14721 corp: 17/351b lim: 50 exec/s: 41 rss: 72Mb L: 19/45 MS: 1 ChangeBinInt- 00:08:39.686 [2024-11-29 19:21:59.590233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.686 [2024-11-29 19:21:59.590262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.945 #47 NEW cov: 12527 ft: 14749 corp: 18/370b lim: 50 exec/s: 47 rss: 72Mb L: 19/45 MS: 1 ShuffleBytes- 00:08:39.945 [2024-11-29 19:21:59.650346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:39.945 [2024-11-29 19:21:59.650373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.945 #48 NEW cov: 12527 ft: 14814 corp: 19/389b lim: 50 exec/s: 48 rss: 72Mb L: 19/45 MS: 1 ChangeBinInt- 00:08:39.945 [2024-11-29 19:21:59.690651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:39.945 [2024-11-29 19:21:59.690678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.945 [2024-11-29 19:21:59.690715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551395 len:65536 00:08:39.945 [2024-11-29 19:21:59.690730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.945 [2024-11-29 19:21:59.690781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4289921024 len:1025 00:08:39.945 [2024-11-29 19:21:59.690797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.945 #49 NEW cov: 12527 ft: 15087 corp: 20/423b lim: 50 exec/s: 49 rss: 72Mb L: 34/45 MS: 1 CMP- DE: "\000\000\000\000\000\000\004\000"- 00:08:39.945 [2024-11-29 19:21:59.750713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3372154880 len:3 00:08:39.945 [2024-11-29 19:21:59.750741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.945 [2024-11-29 19:21:59.750781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808035 len:1 00:08:39.945 [2024-11-29 19:21:59.750798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.945 #50 NEW cov: 12527 ft: 15091 corp: 21/443b lim: 50 exec/s: 50 rss: 72Mb L: 20/45 MS: 1 InsertByte- 00:08:39.945 [2024-11-29 19:21:59.811001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:62720 00:08:39.945 [2024-11-29 19:21:59.811027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.945 [2024-11-29 19:21:59.811062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:18446743128816746495 len:65536 00:08:39.945 [2024-11-29 19:21:59.811077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.945 [2024-11-29 19:21:59.811129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18425070504797798399 len:1 00:08:39.945 [2024-11-29 19:21:59.811144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.945 #51 NEW cov: 12527 ft: 15115 corp: 22/473b lim: 50 exec/s: 51 rss: 72Mb L: 30/45 MS: 1 CMP- DE: "\364\377\377\377"- 00:08:39.945 [2024-11-29 19:21:59.851036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1996488704 len:1 00:08:39.945 [2024-11-29 19:21:59.851064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.945 [2024-11-29 19:21:59.851097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:39.945 [2024-11-29 19:21:59.851113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.205 #53 NEW cov: 12527 ft: 15121 corp: 23/502b lim: 50 exec/s: 53 rss: 72Mb L: 29/45 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:40.205 [2024-11-29 19:21:59.891183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.205 [2024-11-29 19:21:59.891211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.205 [2024-11-29 19:21:59.891258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9404222471055343490 len:33411 00:08:40.205 [2024-11-29 19:21:59.891274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.205 #54 NEW cov: 12527 ft: 15129 corp: 24/530b lim: 50 exec/s: 54 rss: 72Mb L: 28/45 MS: 1 InsertRepeatedBytes- 00:08:40.205 [2024-11-29 19:21:59.931220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.205 [2024-11-29 19:21:59.931247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.205 [2024-11-29 19:21:59.931282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9404222471055343490 len:33411 00:08:40.205 [2024-11-29 19:21:59.931298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.205 #55 NEW cov: 12527 ft: 15140 corp: 25/559b lim: 50 exec/s: 55 rss: 73Mb L: 29/45 MS: 1 InsertByte- 00:08:40.205 [2024-11-29 19:21:59.991402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.205 [2024-11-29 19:21:59.991429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.205 [2024-11-29 19:21:59.991464] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967295 len:65536 00:08:40.205 [2024-11-29 19:21:59.991479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.205 #56 NEW cov: 12527 ft: 15162 corp: 26/587b lim: 50 exec/s: 56 rss: 73Mb L: 28/45 MS: 1 CrossOver- 00:08:40.205 [2024-11-29 19:22:00.051796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463831640113151 len:65536 00:08:40.205 [2024-11-29 19:22:00.051835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.205 [2024-11-29 19:22:00.051900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18411421725796335615 len:33411 00:08:40.205 [2024-11-29 19:22:00.051924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.205 [2024-11-29 19:22:00.051990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744071604175395 len:65536 00:08:40.205 [2024-11-29 19:22:00.052014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.205 #57 NEW cov: 12527 ft: 15308 corp: 27/617b lim: 50 exec/s: 57 rss: 73Mb L: 30/45 MS: 1 CMP- DE: "\001\037"- 00:08:40.205 [2024-11-29 19:22:00.101797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.205 [2024-11-29 19:22:00.101830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.205 [2024-11-29 19:22:00.101880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462671747284771 len:1 00:08:40.205 [2024-11-29 19:22:00.101897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.464 #58 NEW cov: 12527 ft: 15312 corp: 28/637b lim: 50 exec/s: 58 rss: 73Mb L: 20/45 MS: 1 ChangeBit- 00:08:40.464 [2024-11-29 19:22:00.141993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744072786804735 len:65536 00:08:40.464 [2024-11-29 19:22:00.142022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.142056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2594072288873676799 len:1 00:08:40.464 [2024-11-29 19:22:00.142071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.142122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.464 [2024-11-29 19:22:00.142139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.464 #59 NEW cov: 12527 ft: 15354 corp: 29/669b lim: 50 exec/s: 59 rss: 73Mb L: 32/45 MS: 1 InsertRepeatedBytes- 
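Each "#N NEW cov:" record above is standard libFuzzer progress output: cov is the number of coverage points hit so far, ft the number of features, corp the corpus size in units/bytes, lim the current input-length limit, exec/s the execution rate, rss the resident memory, L the size of the new input versus the largest seen, and MS the mutation sequence that produced it. One quick way to watch coverage growth across a run is to pull the counter out of these records; a rough sketch, assuming the console output was saved to a file (build.log is an illustrative name, not from this job):

# Extract the coverage counter from each "NEW cov:" record and show the last few values.
grep -o 'NEW cov: [0-9]*' build.log | awk '{print $3}' | tail -n 5
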
00:08:40.464 [2024-11-29 19:22:00.182237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8029759186529156975 len:28528 00:08:40.464 [2024-11-29 19:22:00.182266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.182310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8029759185026510703 len:28528 00:08:40.464 [2024-11-29 19:22:00.182327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.182388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446585741609758575 len:65536 00:08:40.464 [2024-11-29 19:22:00.182404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.182455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12970366926827028479 len:9216 00:08:40.464 [2024-11-29 19:22:00.182471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.464 #60 NEW cov: 12527 ft: 15401 corp: 30/714b lim: 50 exec/s: 60 rss: 73Mb L: 45/45 MS: 1 CopyPart- 00:08:40.464 [2024-11-29 19:22:00.242198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.464 [2024-11-29 19:22:00.242225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.242259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8029759805912776559 len:65536 00:08:40.464 [2024-11-29 19:22:00.242275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.464 #61 NEW cov: 12527 ft: 15432 corp: 31/734b lim: 50 exec/s: 61 rss: 73Mb L: 20/45 MS: 1 CrossOver- 00:08:40.464 [2024-11-29 19:22:00.302349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.464 [2024-11-29 19:22:00.302377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.464 [2024-11-29 19:22:00.302426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551395 len:65536 00:08:40.464 [2024-11-29 19:22:00.302442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.464 #62 NEW cov: 12527 ft: 15449 corp: 32/754b lim: 50 exec/s: 62 rss: 73Mb L: 20/45 MS: 1 CopyPart- 00:08:40.464 [2024-11-29 19:22:00.342353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446463831640113151 len:65536 00:08:40.464 [2024-11-29 19:22:00.342381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.723 #63 NEW cov: 12527 ft: 15467 corp: 33/772b lim: 50 
exec/s: 63 rss: 73Mb L: 18/45 MS: 1 CrossOver- 00:08:40.723 [2024-11-29 19:22:00.402611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446558255321710591 len:65536 00:08:40.723 [2024-11-29 19:22:00.402654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.723 [2024-11-29 19:22:00.402724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808035 len:1 00:08:40.723 [2024-11-29 19:22:00.402740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.723 #64 NEW cov: 12527 ft: 15483 corp: 34/792b lim: 50 exec/s: 64 rss: 73Mb L: 20/45 MS: 1 ChangeByte- 00:08:40.723 [2024-11-29 19:22:00.442727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.723 [2024-11-29 19:22:00.442755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.723 [2024-11-29 19:22:00.442786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8029759805912776559 len:65536 00:08:40.723 [2024-11-29 19:22:00.442802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.723 #65 NEW cov: 12527 ft: 15497 corp: 35/816b lim: 50 exec/s: 65 rss: 73Mb L: 24/45 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:40.723 [2024-11-29 19:22:00.502893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069666242559 len:65536 00:08:40.723 [2024-11-29 19:22:00.502920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.723 [2024-11-29 19:22:00.502954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9404222471055278978 len:33411 00:08:40.723 [2024-11-29 19:22:00.502970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.723 #66 NEW cov: 12527 ft: 15529 corp: 36/844b lim: 50 exec/s: 33 rss: 73Mb L: 28/45 MS: 1 ChangeBinInt- 00:08:40.723 #66 DONE cov: 12527 ft: 15529 corp: 36/844b lim: 50 exec/s: 33 rss: 73Mb 00:08:40.723 ###### Recommended dictionary. ###### 00:08:40.723 "\000\000\000\000" # Uses: 8 00:08:40.723 "\000\000\000\000\000\000\004\000" # Uses: 0 00:08:40.723 "\364\377\377\377" # Uses: 0 00:08:40.723 "\001\037" # Uses: 0 00:08:40.723 ###### End of recommended dictionary. 
###### 00:08:40.723 Done 66 runs in 2 second(s) 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.723 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:40.724 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:40.724 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:40.724 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:40.982 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:40.982 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:40.982 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:40.982 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:40.982 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:40.982 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:40.983 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:40.983 19:22:00 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:40.983 [2024-11-29 19:22:00.665360] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
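The shell trace above is the per-fuzzer setup nvmf/run.sh performs between instances: tear down the previous run's config, derive the listen port from the fuzzer number, rewrite the JSON config's trsvcid with sed, populate the LSAN leak-suppression file, and launch llvm_nvme_fuzz against the resulting TCP transport ID. A hand-run equivalent, as a minimal sketch: the flags are copied from the trace, while the SPDK path, output directory, and corpus location are shortened stand-ins, and the suppressions are written directly rather than via the trace's echo lines.

# Manual launch mirroring the run.sh trace above; paths are illustrative.
SPDK=/path/to/spdk
TRID='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420'
sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' \
    "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_20.conf
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 \
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$SPDK/../output/llvm/" -F "$TRID" -c /tmp/fuzz_json_20.conf \
    -t 1 -D "$SPDK/../corpus/llvm_nvmf_20" -Z 20

Fuzzer 21's setup further below is identical except that the sed swaps trsvcid 4420 for 4421 and -Z selects fuzzer 21.
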
00:08:40.983 [2024-11-29 19:22:00.665430] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1754722 ] 00:08:40.983 [2024-11-29 19:22:00.852721] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.983 [2024-11-29 19:22:00.865375] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.241 [2024-11-29 19:22:00.918551] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:41.241 [2024-11-29 19:22:00.934949] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:41.241 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.241 INFO: Seed: 15588995 00:08:41.241 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:41.241 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:41.241 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:41.241 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.241 #2 INITED exec/s: 0 rss: 65Mb 00:08:41.241 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.241 This may also happen if the target rejected all inputs we tried so far 00:08:41.241 [2024-11-29 19:22:00.984235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.241 [2024-11-29 19:22:00.984268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.241 [2024-11-29 19:22:00.984322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.241 [2024-11-29 19:22:00.984338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.499 NEW_FUNC[1/717]: 0x47b978 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:41.499 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:41.499 #5 NEW cov: 12357 ft: 12350 corp: 2/46b lim: 90 exec/s: 0 rss: 72Mb L: 45/45 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:41.499 [2024-11-29 19:22:01.315111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.499 [2024-11-29 19:22:01.315144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.499 [2024-11-29 19:22:01.315204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.499 [2024-11-29 19:22:01.315221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.499 NEW_FUNC[1/1]: 0x1011a78 in rte_rdtsc /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/rte_cycles.h:31 00:08:41.499 #15 NEW cov: 12471 ft: 13019 corp: 3/97b lim: 90 exec/s: 0 rss: 72Mb L: 51/51 MS: 5 ChangeByte-InsertByte-ShuffleBytes-InsertRepeatedBytes-CrossOver- 00:08:41.499 [2024-11-29 
19:22:01.354984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.499 [2024-11-29 19:22:01.355013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.499 #19 NEW cov: 12477 ft: 14073 corp: 4/124b lim: 90 exec/s: 0 rss: 72Mb L: 27/51 MS: 4 InsertRepeatedBytes-CrossOver-ChangeBit-CrossOver- 00:08:41.499 [2024-11-29 19:22:01.395235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.499 [2024-11-29 19:22:01.395263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.499 [2024-11-29 19:22:01.395331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.499 [2024-11-29 19:22:01.395346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.757 #20 NEW cov: 12562 ft: 14363 corp: 5/175b lim: 90 exec/s: 0 rss: 72Mb L: 51/51 MS: 1 ShuffleBytes- 00:08:41.757 [2024-11-29 19:22:01.455279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.757 [2024-11-29 19:22:01.455307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.757 #21 NEW cov: 12562 ft: 14455 corp: 6/207b lim: 90 exec/s: 0 rss: 72Mb L: 32/51 MS: 1 EraseBytes- 00:08:41.757 [2024-11-29 19:22:01.515450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.757 [2024-11-29 19:22:01.515476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.758 #22 NEW cov: 12562 ft: 14489 corp: 7/239b lim: 90 exec/s: 0 rss: 73Mb L: 32/51 MS: 1 ChangeByte- 00:08:41.758 [2024-11-29 19:22:01.575792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.758 [2024-11-29 19:22:01.575822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.758 [2024-11-29 19:22:01.575880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.758 [2024-11-29 19:22:01.575898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.758 #23 NEW cov: 12562 ft: 14602 corp: 8/291b lim: 90 exec/s: 0 rss: 73Mb L: 52/52 MS: 1 InsertByte- 00:08:41.758 [2024-11-29 19:22:01.616225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.758 [2024-11-29 19:22:01.616254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.758 [2024-11-29 19:22:01.616305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.758 [2024-11-29 19:22:01.616322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.758 [2024-11-29 19:22:01.616378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:41.758 [2024-11-29 19:22:01.616394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.758 [2024-11-29 19:22:01.616453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:41.758 [2024-11-29 19:22:01.616469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.758 #24 NEW cov: 12562 ft: 15038 corp: 9/365b lim: 90 exec/s: 0 rss: 73Mb L: 74/74 MS: 1 CrossOver- 00:08:42.016 [2024-11-29 19:22:01.675900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.016 [2024-11-29 19:22:01.675930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.016 #25 NEW cov: 12562 ft: 15112 corp: 10/390b lim: 90 exec/s: 0 rss: 73Mb L: 25/74 MS: 1 EraseBytes- 00:08:42.016 [2024-11-29 19:22:01.716037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.016 [2024-11-29 19:22:01.716067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.016 #26 NEW cov: 12562 ft: 15184 corp: 11/415b lim: 90 exec/s: 0 rss: 73Mb L: 25/74 MS: 1 ChangeBit- 00:08:42.016 [2024-11-29 19:22:01.776690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.016 [2024-11-29 19:22:01.776719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.016 [2024-11-29 19:22:01.776797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.016 [2024-11-29 19:22:01.776814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.016 [2024-11-29 19:22:01.776871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:42.016 [2024-11-29 19:22:01.776887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.016 [2024-11-29 19:22:01.776945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:42.016 [2024-11-29 19:22:01.776961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.016 #27 NEW cov: 12562 ft: 15204 corp: 12/489b lim: 90 exec/s: 0 rss: 73Mb L: 74/74 MS: 1 ShuffleBytes- 00:08:42.016 [2024-11-29 19:22:01.836863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.016 [2024-11-29 19:22:01.836891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.016 [2024-11-29 19:22:01.836938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.016 [2024-11-29 19:22:01.836955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:42.016 [2024-11-29 19:22:01.837010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:42.016 [2024-11-29 19:22:01.837026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.016 [2024-11-29 19:22:01.837083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:42.016 [2024-11-29 19:22:01.837097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.016 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:42.016 #28 NEW cov: 12585 ft: 15244 corp: 13/563b lim: 90 exec/s: 0 rss: 73Mb L: 74/74 MS: 1 ChangeBinInt- 00:08:42.016 [2024-11-29 19:22:01.896676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.016 [2024-11-29 19:22:01.896704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.016 [2024-11-29 19:22:01.896770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.016 [2024-11-29 19:22:01.896787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.016 #29 NEW cov: 12585 ft: 15269 corp: 14/604b lim: 90 exec/s: 0 rss: 73Mb L: 41/74 MS: 1 InsertRepeatedBytes- 00:08:42.275 [2024-11-29 19:22:01.936789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.275 [2024-11-29 19:22:01.936818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.275 [2024-11-29 19:22:01.936886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.275 [2024-11-29 19:22:01.936906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.275 #30 NEW cov: 12585 ft: 15295 corp: 15/655b lim: 90 exec/s: 30 rss: 73Mb L: 51/74 MS: 1 ShuffleBytes- 00:08:42.275 [2024-11-29 19:22:01.996946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.275 [2024-11-29 19:22:01.996974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.275 [2024-11-29 19:22:01.997011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.275 [2024-11-29 19:22:01.997027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.275 #31 NEW cov: 12585 ft: 15324 corp: 16/706b lim: 90 exec/s: 31 rss: 73Mb L: 51/74 MS: 1 ChangeBit- 00:08:42.275 [2024-11-29 19:22:02.056966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.275 [2024-11-29 19:22:02.056994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.275 #32 NEW cov: 12585 ft: 15333 corp: 17/731b lim: 90 exec/s: 32 rss: 
73Mb L: 25/74 MS: 1 ChangeByte- 00:08:42.275 [2024-11-29 19:22:02.097214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.275 [2024-11-29 19:22:02.097242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.275 [2024-11-29 19:22:02.097289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.275 [2024-11-29 19:22:02.097306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.275 #33 NEW cov: 12585 ft: 15369 corp: 18/769b lim: 90 exec/s: 33 rss: 73Mb L: 38/74 MS: 1 EraseBytes- 00:08:42.275 [2024-11-29 19:22:02.157390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.275 [2024-11-29 19:22:02.157418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.275 [2024-11-29 19:22:02.157484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.275 [2024-11-29 19:22:02.157501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.275 #34 NEW cov: 12585 ft: 15386 corp: 19/822b lim: 90 exec/s: 34 rss: 73Mb L: 53/74 MS: 1 CrossOver- 00:08:42.534 [2024-11-29 19:22:02.197484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.534 [2024-11-29 19:22:02.197512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.197579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.534 [2024-11-29 19:22:02.197596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.534 #35 NEW cov: 12585 ft: 15410 corp: 20/864b lim: 90 exec/s: 35 rss: 74Mb L: 42/74 MS: 1 InsertByte- 00:08:42.534 [2024-11-29 19:22:02.257640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.534 [2024-11-29 19:22:02.257667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.257703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.534 [2024-11-29 19:22:02.257719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.534 #36 NEW cov: 12585 ft: 15442 corp: 21/909b lim: 90 exec/s: 36 rss: 74Mb L: 45/74 MS: 1 ChangeASCIIInt- 00:08:42.534 [2024-11-29 19:22:02.297748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.534 [2024-11-29 19:22:02.297776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.297830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.534 [2024-11-29 
19:22:02.297847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.534 #37 NEW cov: 12585 ft: 15481 corp: 22/950b lim: 90 exec/s: 37 rss: 74Mb L: 41/74 MS: 1 EraseBytes- 00:08:42.534 [2024-11-29 19:22:02.337921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.534 [2024-11-29 19:22:02.337948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.337999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.534 [2024-11-29 19:22:02.338015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.534 #38 NEW cov: 12585 ft: 15489 corp: 23/992b lim: 90 exec/s: 38 rss: 74Mb L: 42/74 MS: 1 CrossOver- 00:08:42.534 [2024-11-29 19:22:02.378011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.534 [2024-11-29 19:22:02.378038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.378073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.534 [2024-11-29 19:22:02.378089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.534 #39 NEW cov: 12585 ft: 15505 corp: 24/1034b lim: 90 exec/s: 39 rss: 74Mb L: 42/74 MS: 1 ChangeBinInt- 00:08:42.534 [2024-11-29 19:22:02.438571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.534 [2024-11-29 19:22:02.438607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.438647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.534 [2024-11-29 19:22:02.438665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.438722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:42.534 [2024-11-29 19:22:02.438737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.534 [2024-11-29 19:22:02.438801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:42.534 [2024-11-29 19:22:02.438820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.797 #40 NEW cov: 12585 ft: 15526 corp: 25/1109b lim: 90 exec/s: 40 rss: 74Mb L: 75/75 MS: 1 InsertByte- 00:08:42.797 [2024-11-29 19:22:02.498362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.797 [2024-11-29 19:22:02.498390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.797 [2024-11-29 19:22:02.498458] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.797 [2024-11-29 19:22:02.498474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.797 #41 NEW cov: 12585 ft: 15534 corp: 26/1161b lim: 90 exec/s: 41 rss: 74Mb L: 52/75 MS: 1 ChangeBinInt- 00:08:42.797 [2024-11-29 19:22:02.538453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.797 [2024-11-29 19:22:02.538481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.797 [2024-11-29 19:22:02.538539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.797 [2024-11-29 19:22:02.538555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.797 #42 NEW cov: 12585 ft: 15553 corp: 27/1212b lim: 90 exec/s: 42 rss: 74Mb L: 51/75 MS: 1 ChangeBinInt- 00:08:42.797 [2024-11-29 19:22:02.578629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.797 [2024-11-29 19:22:02.578657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.797 [2024-11-29 19:22:02.578711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.797 [2024-11-29 19:22:02.578728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.797 #43 NEW cov: 12585 ft: 15594 corp: 28/1254b lim: 90 exec/s: 43 rss: 74Mb L: 42/75 MS: 1 CrossOver- 00:08:42.797 [2024-11-29 19:22:02.638596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.797 [2024-11-29 19:22:02.638630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.797 #44 NEW cov: 12585 ft: 15622 corp: 29/1286b lim: 90 exec/s: 44 rss: 74Mb L: 32/75 MS: 1 ChangeByte- 00:08:42.797 [2024-11-29 19:22:02.678734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.797 [2024-11-29 19:22:02.678764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.105 #45 NEW cov: 12585 ft: 15632 corp: 30/1310b lim: 90 exec/s: 45 rss: 74Mb L: 24/75 MS: 1 CrossOver- 00:08:43.105 [2024-11-29 19:22:02.748889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.105 [2024-11-29 19:22:02.748917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.105 #46 NEW cov: 12585 ft: 15685 corp: 31/1335b lim: 90 exec/s: 46 rss: 74Mb L: 25/75 MS: 1 ChangeBit- 00:08:43.105 [2024-11-29 19:22:02.809432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.105 [2024-11-29 19:22:02.809459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.105 [2024-11-29 19:22:02.809523] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.105 [2024-11-29 19:22:02.809540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.105 [2024-11-29 19:22:02.809602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.105 [2024-11-29 19:22:02.809619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.105 #47 NEW cov: 12585 ft: 15953 corp: 32/1397b lim: 90 exec/s: 47 rss: 74Mb L: 62/75 MS: 1 CrossOver- 00:08:43.105 [2024-11-29 19:22:02.869415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.105 [2024-11-29 19:22:02.869442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.105 [2024-11-29 19:22:02.869498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.105 [2024-11-29 19:22:02.869515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.105 #48 NEW cov: 12585 ft: 15994 corp: 33/1439b lim: 90 exec/s: 48 rss: 74Mb L: 42/75 MS: 1 ChangeASCIIInt- 00:08:43.105 [2024-11-29 19:22:02.909518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.105 [2024-11-29 19:22:02.909546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.105 [2024-11-29 19:22:02.909584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.105 [2024-11-29 19:22:02.909602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.105 #49 NEW cov: 12585 ft: 15999 corp: 34/1480b lim: 90 exec/s: 49 rss: 74Mb L: 41/75 MS: 1 ChangeBit- 00:08:43.105 [2024-11-29 19:22:02.949618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.105 [2024-11-29 19:22:02.949645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.105 [2024-11-29 19:22:02.949699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.105 [2024-11-29 19:22:02.949716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.105 #53 NEW cov: 12585 ft: 16002 corp: 35/1526b lim: 90 exec/s: 26 rss: 74Mb L: 46/75 MS: 4 CopyPart-CrossOver-CopyPart-CrossOver- 00:08:43.105 #53 DONE cov: 12585 ft: 16002 corp: 35/1526b lim: 90 exec/s: 26 rss: 74Mb 00:08:43.105 Done 53 runs in 2 second(s) 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:43.427 19:22:03 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:43.427 19:22:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:43.427 [2024-11-29 19:22:03.114393] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:43.427 [2024-11-29 19:22:03.114475] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1755010 ] 00:08:43.427 [2024-11-29 19:22:03.299225] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.427 [2024-11-29 19:22:03.312801] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.685 [2024-11-29 19:22:03.365749] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.685 [2024-11-29 19:22:03.382137] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:43.685 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:43.685 INFO: Seed: 2464588637 00:08:43.685 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:43.685 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:43.685 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:43.685 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.685 #2 INITED exec/s: 0 rss: 64Mb 00:08:43.685 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.685 This may also happen if the target rejected all inputs we tried so far 00:08:43.686 [2024-11-29 19:22:03.458311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.686 [2024-11-29 19:22:03.458351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.686 [2024-11-29 19:22:03.458478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.686 [2024-11-29 19:22:03.458499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.944 NEW_FUNC[1/718]: 0x47eba8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:43.944 NEW_FUNC[2/718]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.944 #3 NEW cov: 12315 ft: 12316 corp: 2/22b lim: 50 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:43.944 [2024-11-29 19:22:03.798879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.944 [2024-11-29 19:22:03.798921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.944 #4 NEW cov: 12445 ft: 13415 corp: 3/35b lim: 50 exec/s: 0 rss: 72Mb L: 13/21 MS: 1 CrossOver- 00:08:43.944 [2024-11-29 19:22:03.849085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.944 [2024-11-29 19:22:03.849118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.203 #5 NEW cov: 12451 ft: 13713 corp: 4/48b lim: 50 exec/s: 0 rss: 72Mb L: 13/21 MS: 1 CrossOver- 00:08:44.203 [2024-11-29 19:22:03.919842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.203 [2024-11-29 19:22:03.919875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.203 [2024-11-29 19:22:03.919959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.203 [2024-11-29 19:22:03.919982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.203 [2024-11-29 19:22:03.920095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.203 [2024-11-29 19:22:03.920118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
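Two pieces of this output are reusable on later runs: libFuzzer's "INFO: Seed:" line (2464588637 for this instance) lets the same random sequence be reproduced via -seed=, and the "Recommended dictionary" block printed at the end of fuzzer 19's run above can be fed back with -dict= once its octal escapes are rewritten in libFuzzer's \xNN dictionary syntax. A sketch for a generic libFuzzer target; the log does not show whether SPDK's llvm_nvme_fuzz wrapper forwards these flags, so ./my_fuzzer, the dictionary file, and the corpus directory are stand-ins:

# Rewrite fuzzer 19's recommended entries in libFuzzer dictionary syntax
# (\000 -> \x00, \364\377\377\377 -> \xf4\xff\xff\xff, \001\037 -> \x01\x1f).
cat > nvmf.dict <<'EOF'
kw1="\x00\x00\x00\x00"
kw2="\x00\x00\x00\x00\x00\x00\x04\x00"
kw3="\xf4\xff\xff\xff"
kw4="\x01\x1f"
EOF
# Replay with a fixed seed and a bounded number of runs.
./my_fuzzer -seed=2464588637 -dict=nvmf.dict -runs=100000 corpus_dir/
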
00:08:44.203 [2024-11-29 19:22:03.920252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:44.203 [2024-11-29 19:22:03.920272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.203 #6 NEW cov: 12536 ft: 14497 corp: 5/92b lim: 50 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:44.203 [2024-11-29 19:22:03.969622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.203 [2024-11-29 19:22:03.969652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.203 [2024-11-29 19:22:03.969771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.203 [2024-11-29 19:22:03.969795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.203 #7 NEW cov: 12536 ft: 14601 corp: 6/113b lim: 50 exec/s: 0 rss: 72Mb L: 21/44 MS: 1 ChangeByte- 00:08:44.203 [2024-11-29 19:22:04.040021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.203 [2024-11-29 19:22:04.040057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.203 [2024-11-29 19:22:04.040177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.203 [2024-11-29 19:22:04.040202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.203 [2024-11-29 19:22:04.040327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.204 [2024-11-29 19:22:04.040350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.204 #8 NEW cov: 12536 ft: 14928 corp: 7/143b lim: 50 exec/s: 0 rss: 72Mb L: 30/44 MS: 1 InsertRepeatedBytes- 00:08:44.204 [2024-11-29 19:22:04.089855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.204 [2024-11-29 19:22:04.089890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.204 [2024-11-29 19:22:04.090016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.204 [2024-11-29 19:22:04.090038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.462 #9 NEW cov: 12536 ft: 14972 corp: 8/165b lim: 50 exec/s: 0 rss: 72Mb L: 22/44 MS: 1 InsertByte- 00:08:44.462 [2024-11-29 19:22:04.159877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.462 [2024-11-29 19:22:04.159903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.462 #10 NEW cov: 12536 ft: 15028 corp: 9/178b lim: 50 exec/s: 0 rss: 73Mb L: 13/44 MS: 1 ChangeBinInt- 00:08:44.462 [2024-11-29 19:22:04.230790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.462 [2024-11-29 19:22:04.230822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.462 [2024-11-29 19:22:04.230905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.462 [2024-11-29 19:22:04.230927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.462 [2024-11-29 19:22:04.231049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.462 [2024-11-29 19:22:04.231071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.462 [2024-11-29 19:22:04.231200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:44.462 [2024-11-29 19:22:04.231222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.462 #11 NEW cov: 12536 ft: 15052 corp: 10/221b lim: 50 exec/s: 0 rss: 73Mb L: 43/44 MS: 1 InsertRepeatedBytes- 00:08:44.462 [2024-11-29 19:22:04.280461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.462 [2024-11-29 19:22:04.280494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.463 [2024-11-29 19:22:04.280611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.463 [2024-11-29 19:22:04.280637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.463 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:44.463 #12 NEW cov: 12559 ft: 15132 corp: 11/243b lim: 50 exec/s: 0 rss: 73Mb L: 22/44 MS: 1 CopyPart- 00:08:44.463 [2024-11-29 19:22:04.350393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.463 [2024-11-29 19:22:04.350419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.721 #13 NEW cov: 12559 ft: 15159 corp: 12/259b lim: 50 exec/s: 0 rss: 73Mb L: 16/44 MS: 1 EraseBytes- 00:08:44.721 [2024-11-29 19:22:04.400856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.721 [2024-11-29 19:22:04.400887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.721 [2024-11-29 19:22:04.401012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.722 [2024-11-29 19:22:04.401039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.722 #14 NEW cov: 12559 ft: 15269 corp: 13/280b lim: 50 exec/s: 0 rss: 73Mb L: 21/44 MS: 1 ChangeBinInt- 00:08:44.722 [2024-11-29 19:22:04.451064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.722 [2024-11-29 
19:22:04.451099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.722 [2024-11-29 19:22:04.451239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.722 [2024-11-29 19:22:04.451260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.722 #15 NEW cov: 12559 ft: 15353 corp: 14/301b lim: 50 exec/s: 15 rss: 73Mb L: 21/44 MS: 1 ChangeBinInt- 00:08:44.722 [2024-11-29 19:22:04.501415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.722 [2024-11-29 19:22:04.501445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.722 [2024-11-29 19:22:04.501541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.722 [2024-11-29 19:22:04.501564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.722 [2024-11-29 19:22:04.501690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.722 [2024-11-29 19:22:04.501711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.722 #16 NEW cov: 12559 ft: 15378 corp: 15/338b lim: 50 exec/s: 16 rss: 73Mb L: 37/44 MS: 1 EraseBytes- 00:08:44.722 [2024-11-29 19:22:04.571125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.722 [2024-11-29 19:22:04.571152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.722 #20 NEW cov: 12559 ft: 15399 corp: 16/356b lim: 50 exec/s: 20 rss: 73Mb L: 18/44 MS: 4 ChangeBit-InsertByte-ChangeBinInt-CrossOver- 00:08:44.722 [2024-11-29 19:22:04.622021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.722 [2024-11-29 19:22:04.622054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.722 [2024-11-29 19:22:04.622164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.722 [2024-11-29 19:22:04.622189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.722 [2024-11-29 19:22:04.622315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.722 [2024-11-29 19:22:04.622354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.722 [2024-11-29 19:22:04.622477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:44.722 [2024-11-29 19:22:04.622503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.981 #21 NEW cov: 12559 ft: 15421 corp: 17/400b lim: 50 exec/s: 21 rss: 73Mb L: 44/44 MS: 1 InsertByte- 00:08:44.981 [2024-11-29 
19:22:04.671962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.981 [2024-11-29 19:22:04.671996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.981 [2024-11-29 19:22:04.672134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.981 [2024-11-29 19:22:04.672158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.981 [2024-11-29 19:22:04.672284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.981 [2024-11-29 19:22:04.672309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.981 #22 NEW cov: 12559 ft: 15458 corp: 18/430b lim: 50 exec/s: 22 rss: 73Mb L: 30/44 MS: 1 ChangeByte- 00:08:44.981 [2024-11-29 19:22:04.741916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.981 [2024-11-29 19:22:04.741951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.981 [2024-11-29 19:22:04.742084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.981 [2024-11-29 19:22:04.742103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.981 #23 NEW cov: 12559 ft: 15484 corp: 19/452b lim: 50 exec/s: 23 rss: 73Mb L: 22/44 MS: 1 InsertByte- 00:08:44.981 [2024-11-29 19:22:04.791752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.981 [2024-11-29 19:22:04.791779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.981 #24 NEW cov: 12559 ft: 15522 corp: 20/471b lim: 50 exec/s: 24 rss: 73Mb L: 19/44 MS: 1 CopyPart- 00:08:44.981 [2024-11-29 19:22:04.862420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.981 [2024-11-29 19:22:04.862456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.981 [2024-11-29 19:22:04.862574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.981 [2024-11-29 19:22:04.862601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.981 [2024-11-29 19:22:04.862733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.981 [2024-11-29 19:22:04.862761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.981 #25 NEW cov: 12559 ft: 15540 corp: 21/501b lim: 50 exec/s: 25 rss: 73Mb L: 30/44 MS: 1 ShuffleBytes- 00:08:45.240 [2024-11-29 19:22:04.912938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.240 [2024-11-29 19:22:04.912972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:04.913081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.240 [2024-11-29 19:22:04.913107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:04.913228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.240 [2024-11-29 19:22:04.913253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:04.913377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.240 [2024-11-29 19:22:04.913401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.240 #26 NEW cov: 12559 ft: 15578 corp: 22/548b lim: 50 exec/s: 26 rss: 73Mb L: 47/47 MS: 1 CrossOver- 00:08:45.240 [2024-11-29 19:22:04.963097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.240 [2024-11-29 19:22:04.963132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:04.963240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.240 [2024-11-29 19:22:04.963262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:04.963389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.240 [2024-11-29 19:22:04.963415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:04.963542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.240 [2024-11-29 19:22:04.963566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.240 #27 NEW cov: 12559 ft: 15600 corp: 23/597b lim: 50 exec/s: 27 rss: 73Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:08:45.240 [2024-11-29 19:22:05.032993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.240 [2024-11-29 19:22:05.033029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:05.033134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.240 [2024-11-29 19:22:05.033154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:05.033281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.240 [2024-11-29 19:22:05.033308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.240 #28 NEW cov: 12559 
ft: 15609 corp: 24/635b lim: 50 exec/s: 28 rss: 73Mb L: 38/49 MS: 1 InsertByte- 00:08:45.240 [2024-11-29 19:22:05.102917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.240 [2024-11-29 19:22:05.102948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.240 [2024-11-29 19:22:05.103076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.240 [2024-11-29 19:22:05.103099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.240 #29 NEW cov: 12559 ft: 15627 corp: 25/656b lim: 50 exec/s: 29 rss: 74Mb L: 21/49 MS: 1 CrossOver- 00:08:45.498 [2024-11-29 19:22:05.172931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.498 [2024-11-29 19:22:05.172961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.498 #30 NEW cov: 12559 ft: 15648 corp: 26/673b lim: 50 exec/s: 30 rss: 74Mb L: 17/49 MS: 1 EraseBytes- 00:08:45.498 [2024-11-29 19:22:05.243102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.498 [2024-11-29 19:22:05.243127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.498 #31 NEW cov: 12559 ft: 15670 corp: 27/690b lim: 50 exec/s: 31 rss: 74Mb L: 17/49 MS: 1 InsertByte- 00:08:45.498 [2024-11-29 19:22:05.314059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.498 [2024-11-29 19:22:05.314090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.498 [2024-11-29 19:22:05.314186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.498 [2024-11-29 19:22:05.314206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.498 [2024-11-29 19:22:05.314330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.498 [2024-11-29 19:22:05.314349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.498 [2024-11-29 19:22:05.314477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.498 [2024-11-29 19:22:05.314504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.498 #32 NEW cov: 12559 ft: 15720 corp: 28/735b lim: 50 exec/s: 32 rss: 74Mb L: 45/49 MS: 1 InsertByte- 00:08:45.498 [2024-11-29 19:22:05.383518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.498 [2024-11-29 19:22:05.383544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.757 #33 NEW cov: 12559 ft: 15733 corp: 29/751b lim: 50 exec/s: 33 rss: 74Mb L: 16/49 MS: 1 CrossOver- 00:08:45.757 [2024-11-29 
19:22:05.434106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.757 [2024-11-29 19:22:05.434140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.757 [2024-11-29 19:22:05.434264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.757 [2024-11-29 19:22:05.434286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.757 [2024-11-29 19:22:05.434422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.757 [2024-11-29 19:22:05.434450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.757 #34 NEW cov: 12559 ft: 15738 corp: 30/781b lim: 50 exec/s: 17 rss: 74Mb L: 30/49 MS: 1 CopyPart- 00:08:45.757 #34 DONE cov: 12559 ft: 15738 corp: 30/781b lim: 50 exec/s: 17 rss: 74Mb 00:08:45.757 Done 34 runs in 2 second(s) 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:45.757 19:22:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp 
adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:45.758 [2024-11-29 19:22:05.620030] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:45.758 [2024-11-29 19:22:05.620101] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1755538 ] 00:08:46.017 [2024-11-29 19:22:05.802955] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.017 [2024-11-29 19:22:05.816648] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.017 [2024-11-29 19:22:05.869554] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:46.017 [2024-11-29 19:22:05.885887] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:46.017 INFO: Running with entropic power schedule (0xFF, 100). 00:08:46.017 INFO: Seed: 673622008 00:08:46.017 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:46.017 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:46.017 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:46.017 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.017 #2 INITED exec/s: 0 rss: 64Mb 00:08:46.017 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:46.017 This may also happen if the target rejected all inputs we tried so far 00:08:46.276 [2024-11-29 19:22:05.930771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.276 [2024-11-29 19:22:05.930803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.276 [2024-11-29 19:22:05.930858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.276 [2024-11-29 19:22:05.930877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.276 [2024-11-29 19:22:05.930906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.276 [2024-11-29 19:22:05.930922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.535 NEW_FUNC[1/718]: 0x480e78 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:46.535 NEW_FUNC[2/718]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:46.535 #11 NEW cov: 12359 ft: 12356 corp: 2/66b lim: 85 exec/s: 0 rss: 72Mb L: 65/65 MS: 4 ShuffleBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:46.535 [2024-11-29 19:22:06.281669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.535 [2024-11-29 19:22:06.281707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:46.535 [2024-11-29 19:22:06.281757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.535 [2024-11-29 19:22:06.281775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.535 [2024-11-29 19:22:06.281805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.535 [2024-11-29 19:22:06.281821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.535 [2024-11-29 19:22:06.281850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.535 [2024-11-29 19:22:06.281867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.535 #12 NEW cov: 12472 ft: 13374 corp: 3/135b lim: 85 exec/s: 0 rss: 73Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:08:46.535 [2024-11-29 19:22:06.371650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.535 [2024-11-29 19:22:06.371681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.535 #14 NEW cov: 12478 ft: 14342 corp: 4/160b lim: 85 exec/s: 0 rss: 73Mb L: 25/69 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:46.535 [2024-11-29 19:22:06.431733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.535 [2024-11-29 19:22:06.431763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.794 #15 NEW cov: 12563 ft: 14629 corp: 5/181b lim: 85 exec/s: 0 rss: 73Mb L: 21/69 MS: 1 EraseBytes- 00:08:46.794 [2024-11-29 19:22:06.522030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.794 [2024-11-29 19:22:06.522062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.794 #16 NEW cov: 12563 ft: 14736 corp: 6/202b lim: 85 exec/s: 0 rss: 73Mb L: 21/69 MS: 1 ShuffleBytes- 00:08:46.794 [2024-11-29 19:22:06.612271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.794 [2024-11-29 19:22:06.612302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.794 #17 NEW cov: 12563 ft: 14840 corp: 7/223b lim: 85 exec/s: 0 rss: 73Mb L: 21/69 MS: 1 ChangeBinInt- 00:08:47.053 [2024-11-29 19:22:06.702630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.053 [2024-11-29 19:22:06.702665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.053 [2024-11-29 19:22:06.702716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.053 [2024-11-29 19:22:06.702736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.053 #18 NEW cov: 12563 ft: 15203 corp: 8/265b lim: 85 
exec/s: 0 rss: 73Mb L: 42/69 MS: 1 EraseBytes- 00:08:47.053 [2024-11-29 19:22:06.792831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.053 [2024-11-29 19:22:06.792861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.053 [2024-11-29 19:22:06.792911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.053 [2024-11-29 19:22:06.792929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.053 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:47.053 #20 NEW cov: 12580 ft: 15276 corp: 9/314b lim: 85 exec/s: 0 rss: 73Mb L: 49/69 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:47.053 [2024-11-29 19:22:06.852918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.053 [2024-11-29 19:22:06.852948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.053 [2024-11-29 19:22:06.852982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.053 [2024-11-29 19:22:06.853000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.053 #21 NEW cov: 12580 ft: 15303 corp: 10/363b lim: 85 exec/s: 21 rss: 73Mb L: 49/69 MS: 1 ChangeByte- 00:08:47.053 [2024-11-29 19:22:06.943194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.053 [2024-11-29 19:22:06.943225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.053 [2024-11-29 19:22:06.943259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.053 [2024-11-29 19:22:06.943277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.312 #22 NEW cov: 12580 ft: 15328 corp: 11/412b lim: 85 exec/s: 22 rss: 73Mb L: 49/69 MS: 1 CopyPart- 00:08:47.312 [2024-11-29 19:22:07.003391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.312 [2024-11-29 19:22:07.003422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.312 [2024-11-29 19:22:07.003469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.312 [2024-11-29 19:22:07.003486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.312 [2024-11-29 19:22:07.003516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.312 [2024-11-29 19:22:07.003532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.312 [2024-11-29 19:22:07.003561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:3 nsid:0 00:08:47.312 [2024-11-29 19:22:07.003577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.312 #23 NEW cov: 12580 ft: 15356 corp: 12/481b lim: 85 exec/s: 23 rss: 73Mb L: 69/69 MS: 1 ChangeBit- 00:08:47.312 [2024-11-29 19:22:07.063500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.312 [2024-11-29 19:22:07.063533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.312 [2024-11-29 19:22:07.063581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.312 [2024-11-29 19:22:07.063605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.312 [2024-11-29 19:22:07.063636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.312 [2024-11-29 19:22:07.063652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.312 #24 NEW cov: 12580 ft: 15370 corp: 13/541b lim: 85 exec/s: 24 rss: 73Mb L: 60/69 MS: 1 InsertRepeatedBytes- 00:08:47.312 [2024-11-29 19:22:07.153694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.312 [2024-11-29 19:22:07.153723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.312 #25 NEW cov: 12580 ft: 15456 corp: 14/562b lim: 85 exec/s: 25 rss: 73Mb L: 21/69 MS: 1 ChangeByte- 00:08:47.312 [2024-11-29 19:22:07.213831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.312 [2024-11-29 19:22:07.213862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.571 #26 NEW cov: 12580 ft: 15531 corp: 15/594b lim: 85 exec/s: 26 rss: 74Mb L: 32/69 MS: 1 EraseBytes- 00:08:47.571 [2024-11-29 19:22:07.304194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.571 [2024-11-29 19:22:07.304224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.571 [2024-11-29 19:22:07.304271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.571 [2024-11-29 19:22:07.304288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.571 [2024-11-29 19:22:07.304319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.571 [2024-11-29 19:22:07.304335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.572 [2024-11-29 19:22:07.304364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:47.572 [2024-11-29 19:22:07.304380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:47.572 #27 NEW cov: 12580 ft: 15572 corp: 16/663b lim: 85 exec/s: 27 rss: 74Mb L: 69/69 MS: 1 ChangeBit- 00:08:47.572 [2024-11-29 19:22:07.354295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.572 [2024-11-29 19:22:07.354326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.572 [2024-11-29 19:22:07.354359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.572 [2024-11-29 19:22:07.354377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.572 [2024-11-29 19:22:07.354407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.572 [2024-11-29 19:22:07.354423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.572 #28 NEW cov: 12580 ft: 15576 corp: 17/728b lim: 85 exec/s: 28 rss: 74Mb L: 65/69 MS: 1 ChangeBinInt- 00:08:47.572 [2024-11-29 19:22:07.414357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.572 [2024-11-29 19:22:07.414390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.572 [2024-11-29 19:22:07.414440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.572 [2024-11-29 19:22:07.414457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.572 #29 NEW cov: 12580 ft: 15597 corp: 18/770b lim: 85 exec/s: 29 rss: 74Mb L: 42/69 MS: 1 ChangeByte- 00:08:47.572 [2024-11-29 19:22:07.474656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.572 [2024-11-29 19:22:07.474687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.572 [2024-11-29 19:22:07.474720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.572 [2024-11-29 19:22:07.474738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.572 [2024-11-29 19:22:07.474770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.572 [2024-11-29 19:22:07.474787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.831 #30 NEW cov: 12580 ft: 15610 corp: 19/826b lim: 85 exec/s: 30 rss: 74Mb L: 56/69 MS: 1 EraseBytes- 00:08:47.831 [2024-11-29 19:22:07.564821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.831 [2024-11-29 19:22:07.564851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.831 [2024-11-29 19:22:07.564884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.831 [2024-11-29 19:22:07.564902] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.831 #31 NEW cov: 12580 ft: 15686 corp: 20/875b lim: 85 exec/s: 31 rss: 74Mb L: 49/69 MS: 1 ShuffleBytes- 00:08:47.831 [2024-11-29 19:22:07.614933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.831 [2024-11-29 19:22:07.614962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.831 [2024-11-29 19:22:07.615009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.831 [2024-11-29 19:22:07.615026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.831 [2024-11-29 19:22:07.615057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.831 [2024-11-29 19:22:07.615072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.831 #32 NEW cov: 12580 ft: 15696 corp: 21/935b lim: 85 exec/s: 32 rss: 74Mb L: 60/69 MS: 1 CMP- DE: "-\"{\035X \224\000"- 00:08:47.831 [2024-11-29 19:22:07.705247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.831 [2024-11-29 19:22:07.705277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.831 [2024-11-29 19:22:07.705309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.831 [2024-11-29 19:22:07.705326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.831 [2024-11-29 19:22:07.705357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.831 [2024-11-29 19:22:07.705374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.090 #33 NEW cov: 12580 ft: 15730 corp: 22/992b lim: 85 exec/s: 33 rss: 74Mb L: 57/69 MS: 1 PersAutoDict- DE: "-\"{\035X \224\000"- 00:08:48.090 [2024-11-29 19:22:07.766118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.090 [2024-11-29 19:22:07.766147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.766200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.090 [2024-11-29 19:22:07.766218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.766275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.090 [2024-11-29 19:22:07.766292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.090 #34 NEW cov: 12580 ft: 15912 corp: 23/1057b lim: 85 exec/s: 34 rss: 74Mb L: 65/69 MS: 1 ShuffleBytes- 00:08:48.090 [2024-11-29 
19:22:07.806219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.090 [2024-11-29 19:22:07.806247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.806292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.090 [2024-11-29 19:22:07.806309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.806365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.090 [2024-11-29 19:22:07.806380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.090 #35 NEW cov: 12587 ft: 15993 corp: 24/1122b lim: 85 exec/s: 35 rss: 74Mb L: 65/69 MS: 1 ShuffleBytes- 00:08:48.090 [2024-11-29 19:22:07.846508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.090 [2024-11-29 19:22:07.846536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.846574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.090 [2024-11-29 19:22:07.846591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.846669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.090 [2024-11-29 19:22:07.846686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.090 [2024-11-29 19:22:07.846743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.090 [2024-11-29 19:22:07.846759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.090 #36 NEW cov: 12587 ft: 16012 corp: 25/1191b lim: 85 exec/s: 36 rss: 74Mb L: 69/69 MS: 1 ShuffleBytes- 00:08:48.090 [2024-11-29 19:22:07.906199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.090 [2024-11-29 19:22:07.906228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.090 #37 NEW cov: 12587 ft: 16018 corp: 26/1211b lim: 85 exec/s: 18 rss: 74Mb L: 20/69 MS: 1 EraseBytes- 00:08:48.090 #37 DONE cov: 12587 ft: 16018 corp: 26/1211b lim: 85 exec/s: 18 rss: 74Mb 00:08:48.090 ###### Recommended dictionary. ###### 00:08:48.090 "-\"{\035X \224\000" # Uses: 1 00:08:48.090 ###### End of recommended dictionary. 
###### 00:08:48.090 Done 37 runs in 2 second(s) 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:48.350 19:22:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:48.350 [2024-11-29 19:22:08.075897] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:08:48.350 [2024-11-29 19:22:08.075971] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1755956 ] 00:08:48.609 [2024-11-29 19:22:08.287984] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.609 [2024-11-29 19:22:08.301882] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.609 [2024-11-29 19:22:08.354781] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:48.609 [2024-11-29 19:22:08.371146] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:48.609 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.609 INFO: Seed: 3158612486 00:08:48.609 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:48.609 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:48.609 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:48.609 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.609 #2 INITED exec/s: 0 rss: 63Mb 00:08:48.609 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:48.609 This may also happen if the target rejected all inputs we tried so far 00:08:48.609 [2024-11-29 19:22:08.415830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.609 [2024-11-29 19:22:08.415865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.868 NEW_FUNC[1/717]: 0x4840b8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:48.868 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.868 #7 NEW cov: 12291 ft: 12289 corp: 2/7b lim: 25 exec/s: 0 rss: 71Mb L: 6/6 MS: 5 ShuffleBytes-InsertByte-ShuffleBytes-CopyPart-CMP- DE: "\377\377"- 00:08:48.868 [2024-11-29 19:22:08.757188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.868 [2024-11-29 19:22:08.757230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.127 #8 NEW cov: 12405 ft: 12939 corp: 3/13b lim: 25 exec/s: 0 rss: 71Mb L: 6/6 MS: 1 ChangeByte- 00:08:49.127 [2024-11-29 19:22:08.817226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.127 [2024-11-29 19:22:08.817252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.127 #11 NEW cov: 12411 ft: 13161 corp: 4/18b lim: 25 exec/s: 0 rss: 71Mb L: 5/6 MS: 3 EraseBytes-ChangeByte-CopyPart- 00:08:49.127 [2024-11-29 19:22:08.877376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.127 [2024-11-29 19:22:08.877404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.127 #12 NEW cov: 12496 ft: 13414 corp: 5/25b lim: 25 exec/s: 0 rss: 71Mb 
L: 7/7 MS: 1 CrossOver- 00:08:49.127 [2024-11-29 19:22:08.917571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.127 [2024-11-29 19:22:08.917603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.127 [2024-11-29 19:22:08.917640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.127 [2024-11-29 19:22:08.917657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.127 #13 NEW cov: 12496 ft: 13828 corp: 6/37b lim: 25 exec/s: 0 rss: 71Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:08:49.127 [2024-11-29 19:22:08.957694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.127 [2024-11-29 19:22:08.957720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.127 [2024-11-29 19:22:08.957775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.127 [2024-11-29 19:22:08.957798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.127 #14 NEW cov: 12496 ft: 13980 corp: 7/49b lim: 25 exec/s: 0 rss: 71Mb L: 12/12 MS: 1 ChangeByte- 00:08:49.127 [2024-11-29 19:22:09.018112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.127 [2024-11-29 19:22:09.018138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.127 [2024-11-29 19:22:09.018207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.127 [2024-11-29 19:22:09.018223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.127 [2024-11-29 19:22:09.018276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.127 [2024-11-29 19:22:09.018291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.127 [2024-11-29 19:22:09.018345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.127 [2024-11-29 19:22:09.018361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.387 #15 NEW cov: 12496 ft: 14589 corp: 8/70b lim: 25 exec/s: 0 rss: 71Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:49.387 [2024-11-29 19:22:09.058092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.387 [2024-11-29 19:22:09.058119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.387 [2024-11-29 19:22:09.058165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.387 [2024-11-29 19:22:09.058187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:49.387 [2024-11-29 19:22:09.058241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.387 [2024-11-29 19:22:09.058255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.387 #16 NEW cov: 12496 ft: 14823 corp: 9/87b lim: 25 exec/s: 0 rss: 71Mb L: 17/21 MS: 1 InsertRepeatedBytes- 00:08:49.387 [2024-11-29 19:22:09.098099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.387 [2024-11-29 19:22:09.098125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.387 [2024-11-29 19:22:09.098179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.387 [2024-11-29 19:22:09.098196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.387 #17 NEW cov: 12496 ft: 14900 corp: 10/99b lim: 25 exec/s: 0 rss: 71Mb L: 12/21 MS: 1 ChangeBinInt- 00:08:49.387 [2024-11-29 19:22:09.138238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.387 [2024-11-29 19:22:09.138264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.387 [2024-11-29 19:22:09.138304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.387 [2024-11-29 19:22:09.138320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.387 #18 NEW cov: 12496 ft: 14954 corp: 11/113b lim: 25 exec/s: 0 rss: 71Mb L: 14/21 MS: 1 CopyPart- 00:08:49.387 [2024-11-29 19:22:09.198399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.387 [2024-11-29 19:22:09.198426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.387 [2024-11-29 19:22:09.198479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.387 [2024-11-29 19:22:09.198499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.387 #19 NEW cov: 12496 ft: 15033 corp: 12/125b lim: 25 exec/s: 0 rss: 71Mb L: 12/21 MS: 1 ChangeByte- 00:08:49.387 [2024-11-29 19:22:09.258492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.387 [2024-11-29 19:22:09.258518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 #20 NEW cov: 12496 ft: 15083 corp: 13/130b lim: 25 exec/s: 0 rss: 71Mb L: 5/21 MS: 1 ChangeByte- 00:08:49.647 [2024-11-29 19:22:09.319014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-11-29 19:22:09.319041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.319089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.647 [2024-11-29 19:22:09.319108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.319165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.647 [2024-11-29 19:22:09.319181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.319233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.647 [2024-11-29 19:22:09.319248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.647 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:49.647 #21 NEW cov: 12513 ft: 15125 corp: 14/152b lim: 25 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertByte- 00:08:49.647 [2024-11-29 19:22:09.379077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-11-29 19:22:09.379105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.379153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.647 [2024-11-29 19:22:09.379169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.379224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.647 [2024-11-29 19:22:09.379239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.379295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.647 [2024-11-29 19:22:09.379311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.647 #22 NEW cov: 12513 ft: 15160 corp: 15/176b lim: 25 exec/s: 22 rss: 72Mb L: 24/24 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:49.647 [2024-11-29 19:22:09.439082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-11-29 19:22:09.439108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.439164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.647 [2024-11-29 19:22:09.439182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.647 #23 NEW cov: 12513 ft: 15219 corp: 16/188b lim: 25 exec/s: 23 rss: 72Mb L: 12/24 MS: 1 ShuffleBytes- 00:08:49.647 [2024-11-29 19:22:09.479309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-11-29 19:22:09.479336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.479388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.647 [2024-11-29 19:22:09.479409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.647 [2024-11-29 19:22:09.479461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.647 [2024-11-29 19:22:09.479477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.647 #24 NEW cov: 12513 ft: 15276 corp: 17/207b lim: 25 exec/s: 24 rss: 72Mb L: 19/24 MS: 1 InsertRepeatedBytes- 00:08:49.647 [2024-11-29 19:22:09.539296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-11-29 19:22:09.539323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.906 #25 NEW cov: 12513 ft: 15289 corp: 18/213b lim: 25 exec/s: 25 rss: 72Mb L: 6/24 MS: 1 CrossOver- 00:08:49.906 [2024-11-29 19:22:09.579606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.906 [2024-11-29 19:22:09.579633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.906 [2024-11-29 19:22:09.579697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.906 [2024-11-29 19:22:09.579713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.906 [2024-11-29 19:22:09.579767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.906 [2024-11-29 19:22:09.579784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.906 #26 NEW cov: 12513 ft: 15333 corp: 19/231b lim: 25 exec/s: 26 rss: 72Mb L: 18/24 MS: 1 InsertByte- 00:08:49.906 [2024-11-29 19:22:09.639831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.907 [2024-11-29 19:22:09.639859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.639905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.907 [2024-11-29 19:22:09.639928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.639984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.907 [2024-11-29 19:22:09.640001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.907 #27 NEW cov: 12513 ft: 15363 corp: 20/250b lim: 25 exec/s: 27 rss: 72Mb L: 19/24 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:49.907 [2024-11-29 19:22:09.679903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.907 [2024-11-29 
19:22:09.679930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.679969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.907 [2024-11-29 19:22:09.679985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.680041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.907 [2024-11-29 19:22:09.680057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.907 #28 NEW cov: 12513 ft: 15392 corp: 21/268b lim: 25 exec/s: 28 rss: 72Mb L: 18/24 MS: 1 ShuffleBytes- 00:08:49.907 [2024-11-29 19:22:09.740288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.907 [2024-11-29 19:22:09.740314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.740370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.907 [2024-11-29 19:22:09.740385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.740438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.907 [2024-11-29 19:22:09.740453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.740507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.907 [2024-11-29 19:22:09.740525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.740579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:49.907 [2024-11-29 19:22:09.740596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.907 #29 NEW cov: 12513 ft: 15450 corp: 22/293b lim: 25 exec/s: 29 rss: 72Mb L: 25/25 MS: 1 CopyPart- 00:08:49.907 [2024-11-29 19:22:09.800246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.907 [2024-11-29 19:22:09.800272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.800309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.907 [2024-11-29 19:22:09.800325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.907 [2024-11-29 19:22:09.800382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.907 [2024-11-29 19:22:09.800398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:50.165 #30 NEW cov: 12513 ft: 15471 corp: 23/312b lim: 25 exec/s: 30 rss: 72Mb L: 19/25 MS: 1 ChangeBit- 00:08:50.165 [2024-11-29 19:22:09.860280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.165 [2024-11-29 19:22:09.860307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.165 [2024-11-29 19:22:09.860346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.165 [2024-11-29 19:22:09.860361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.165 #31 NEW cov: 12513 ft: 15486 corp: 24/324b lim: 25 exec/s: 31 rss: 72Mb L: 12/25 MS: 1 ChangeByte- 00:08:50.165 [2024-11-29 19:22:09.900610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.165 [2024-11-29 19:22:09.900637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.165 [2024-11-29 19:22:09.900694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.165 [2024-11-29 19:22:09.900709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.165 [2024-11-29 19:22:09.900765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.165 [2024-11-29 19:22:09.900779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.165 [2024-11-29 19:22:09.900834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.165 [2024-11-29 19:22:09.900851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.165 #32 NEW cov: 12513 ft: 15495 corp: 25/348b lim: 25 exec/s: 32 rss: 72Mb L: 24/25 MS: 1 ChangeBit- 00:08:50.165 [2024-11-29 19:22:09.940470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.166 [2024-11-29 19:22:09.940496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.166 [2024-11-29 19:22:09.940534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.166 [2024-11-29 19:22:09.940549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.166 #33 NEW cov: 12513 ft: 15506 corp: 26/361b lim: 25 exec/s: 33 rss: 72Mb L: 13/25 MS: 1 CrossOver- 00:08:50.166 [2024-11-29 19:22:09.980521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.166 [2024-11-29 19:22:09.980548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.166 #34 NEW cov: 12513 ft: 15573 corp: 27/366b lim: 25 exec/s: 34 rss: 72Mb L: 5/25 MS: 1 ChangeBinInt- 00:08:50.166 [2024-11-29 19:22:10.041497] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.166 [2024-11-29 19:22:10.041529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.166 [2024-11-29 19:22:10.041577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.166 [2024-11-29 19:22:10.041595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.166 #35 NEW cov: 12513 ft: 15591 corp: 28/378b lim: 25 exec/s: 35 rss: 73Mb L: 12/25 MS: 1 ChangeBit- 00:08:50.424 [2024-11-29 19:22:10.081061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.424 [2024-11-29 19:22:10.081091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.081127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.424 [2024-11-29 19:22:10.081143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.081199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.424 [2024-11-29 19:22:10.081215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.424 #36 NEW cov: 12513 ft: 15633 corp: 29/397b lim: 25 exec/s: 36 rss: 73Mb L: 19/25 MS: 1 InsertByte- 00:08:50.424 [2024-11-29 19:22:10.141182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.424 [2024-11-29 19:22:10.141211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.141264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.424 [2024-11-29 19:22:10.141281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.141336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.424 [2024-11-29 19:22:10.141352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.424 #37 NEW cov: 12513 ft: 15669 corp: 30/416b lim: 25 exec/s: 37 rss: 73Mb L: 19/25 MS: 1 CMP- DE: "\036\000\000\000"- 00:08:50.424 [2024-11-29 19:22:10.201236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.424 [2024-11-29 19:22:10.201263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.201300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.424 [2024-11-29 19:22:10.201316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.424 #38 NEW cov: 12513 ft: 15686 corp: 31/428b lim: 25 exec/s: 38 rss: 73Mb L: 
12/25 MS: 1 CopyPart- 00:08:50.424 [2024-11-29 19:22:10.261423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.424 [2024-11-29 19:22:10.261454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.261492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.424 [2024-11-29 19:22:10.261508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.424 #39 NEW cov: 12513 ft: 15698 corp: 32/441b lim: 25 exec/s: 39 rss: 73Mb L: 13/25 MS: 1 InsertByte- 00:08:50.424 [2024-11-29 19:22:10.301613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.424 [2024-11-29 19:22:10.301641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.301705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.424 [2024-11-29 19:22:10.301722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.424 [2024-11-29 19:22:10.301776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.424 [2024-11-29 19:22:10.301793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.683 #40 NEW cov: 12520 ft: 15715 corp: 33/459b lim: 25 exec/s: 40 rss: 73Mb L: 18/25 MS: 1 InsertRepeatedBytes- 00:08:50.683 [2024-11-29 19:22:10.361848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.683 [2024-11-29 19:22:10.361875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.683 [2024-11-29 19:22:10.361913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.683 [2024-11-29 19:22:10.361929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.683 [2024-11-29 19:22:10.361985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.683 [2024-11-29 19:22:10.362001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.683 #41 NEW cov: 12520 ft: 15718 corp: 34/477b lim: 25 exec/s: 20 rss: 73Mb L: 18/25 MS: 1 EraseBytes- 00:08:50.683 #41 DONE cov: 12520 ft: 15718 corp: 34/477b lim: 25 exec/s: 20 rss: 73Mb 00:08:50.683 ###### Recommended dictionary. ###### 00:08:50.683 "\377\377" # Uses: 2 00:08:50.683 "\036\000\000\000" # Uses: 0 00:08:50.683 ###### End of recommended dictionary. 
###### 00:08:50.683 Done 41 runs in 2 second(s) 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.683 19:22:10 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:50.683 [2024-11-29 19:22:10.550929] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 
00:08:50.683 [2024-11-29 19:22:10.551002] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1756364 ] 00:08:50.941 [2024-11-29 19:22:10.739212] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.941 [2024-11-29 19:22:10.751791] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.941 [2024-11-29 19:22:10.804629] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:50.941 [2024-11-29 19:22:10.821003] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:50.941 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.941 INFO: Seed: 1312652601 00:08:51.199 INFO: Loaded 1 modules (389765 inline 8-bit counters): 389765 [0x2afee8c, 0x2b5e111), 00:08:51.199 INFO: Loaded 1 PC tables (389765 PCs): 389765 [0x2b5e118,0x3150968), 00:08:51.199 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:51.199 INFO: A corpus is not provided, starting from an empty corpus 00:08:51.199 #2 INITED exec/s: 0 rss: 64Mb 00:08:51.199 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:51.199 This may also happen if the target rejected all inputs we tried so far 00:08:51.199 [2024-11-29 19:22:10.886411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.199 [2024-11-29 19:22:10.886444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.199 [2024-11-29 19:22:10.886500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.199 [2024-11-29 19:22:10.886516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.458 NEW_FUNC[1/718]: 0x4851a8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:51.458 NEW_FUNC[2/718]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:51.458 #8 NEW cov: 12364 ft: 12362 corp: 2/45b lim: 100 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:51.458 [2024-11-29 19:22:11.217358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.458 [2024-11-29 19:22:11.217413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.458 [2024-11-29 19:22:11.217495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.458 [2024-11-29 19:22:11.217523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.458 #17 NEW cov: 12477 ft: 12907 corp: 3/93b lim: 100 exec/s: 0 rss: 72Mb L: 48/48 MS: 4 CMP-ChangeBit-ChangeBinInt-CrossOver- DE: "(\326\0134Z 
\224\000"- 00:08:51.458 [2024-11-29 19:22:11.257242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.458 [2024-11-29 19:22:11.257272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.458 [2024-11-29 19:22:11.257315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.458 [2024-11-29 19:22:11.257330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.458 #18 NEW cov: 12483 ft: 13083 corp: 4/141b lim: 100 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 ShuffleBytes- 00:08:51.458 [2024-11-29 19:22:11.317267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.458 [2024-11-29 19:22:11.317296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.458 #19 NEW cov: 12568 ft: 14197 corp: 5/178b lim: 100 exec/s: 0 rss: 72Mb L: 37/48 MS: 1 EraseBytes- 00:08:51.716 [2024-11-29 19:22:11.377608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.377638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.716 [2024-11-29 19:22:11.377673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.377688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.716 #20 NEW cov: 12568 ft: 14450 corp: 6/226b lim: 100 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 CopyPart- 00:08:51.716 [2024-11-29 19:22:11.417511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123111324988434 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.417539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.716 #24 NEW cov: 12568 ft: 14536 corp: 7/252b lim: 100 exec/s: 0 rss: 72Mb L: 26/48 MS: 4 PersAutoDict-ShuffleBytes-EraseBytes-CrossOver- DE: "(\326\0134Z \224\000"- 00:08:51.716 [2024-11-29 19:22:11.457641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123111324988434 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.457669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.716 #25 NEW cov: 12568 ft: 14590 corp: 8/279b lim: 100 exec/s: 0 rss: 72Mb L: 27/48 MS: 1 InsertByte- 00:08:51.716 [2024-11-29 19:22:11.517826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.517855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.716 #26 NEW cov: 12568 ft: 14692 corp: 9/316b lim: 100 exec/s: 0 rss: 72Mb L: 37/48 MS: 1 ChangeBinInt- 00:08:51.716 [2024-11-29 19:22:11.577998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123111324988434 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.578025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.716 #27 NEW cov: 12568 ft: 14730 corp: 10/343b lim: 100 exec/s: 0 rss: 72Mb L: 27/48 MS: 1 InsertByte- 00:08:51.716 [2024-11-29 19:22:11.618088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.716 [2024-11-29 19:22:11.618116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.974 #28 NEW cov: 12568 ft: 14816 corp: 11/380b lim: 100 exec/s: 0 rss: 72Mb L: 37/48 MS: 1 ChangeBit- 00:08:51.974 [2024-11-29 19:22:11.678272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123111320261138 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.974 [2024-11-29 19:22:11.678301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.974 #29 NEW cov: 12568 ft: 14860 corp: 12/407b lim: 100 exec/s: 0 rss: 73Mb L: 27/48 MS: 1 CopyPart- 00:08:51.974 [2024-11-29 19:22:11.738572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.974 [2024-11-29 19:22:11.738604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.974 [2024-11-29 19:22:11.738644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.974 [2024-11-29 19:22:11.738660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.974 NEW_FUNC[1/1]: 0x1c65ac8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:51.974 #30 NEW cov: 12591 ft: 14957 corp: 13/456b lim: 100 exec/s: 0 rss: 73Mb L: 49/49 MS: 1 InsertByte- 00:08:51.974 [2024-11-29 19:22:11.778507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.974 [2024-11-29 19:22:11.778534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.974 #31 NEW cov: 12591 ft: 14976 corp: 14/495b lim: 100 exec/s: 0 rss: 73Mb L: 39/49 MS: 1 CMP- DE: "\030?"- 00:08:51.974 [2024-11-29 19:22:11.818632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.974 [2024-11-29 19:22:11.818659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.974 #32 NEW cov: 12591 ft: 15010 corp: 15/530b lim: 
100 exec/s: 0 rss: 73Mb L: 35/49 MS: 1 EraseBytes- 00:08:51.974 [2024-11-29 19:22:11.858697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.974 [2024-11-29 19:22:11.858725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.232 #33 NEW cov: 12591 ft: 15033 corp: 16/557b lim: 100 exec/s: 33 rss: 73Mb L: 27/49 MS: 1 EraseBytes- 00:08:52.232 [2024-11-29 19:22:11.918920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:11.918948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.232 #34 NEW cov: 12591 ft: 15052 corp: 17/594b lim: 100 exec/s: 34 rss: 73Mb L: 37/49 MS: 1 CopyPart- 00:08:52.232 [2024-11-29 19:22:11.959003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:538640384 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:11.959031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.232 #35 NEW cov: 12591 ft: 15086 corp: 18/621b lim: 100 exec/s: 35 rss: 73Mb L: 27/49 MS: 1 ChangeBinInt- 00:08:52.232 [2024-11-29 19:22:11.999136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:11.999165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.232 [2024-11-29 19:22:12.059453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:12.059481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.232 [2024-11-29 19:22:12.059543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123196682211858 len:9767 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:12.059560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.232 #37 NEW cov: 12591 ft: 15096 corp: 19/673b lim: 100 exec/s: 37 rss: 73Mb L: 52/52 MS: 2 PersAutoDict-InsertRepeatedBytes- DE: "(\326\0134Z \224\000"- 00:08:52.232 [2024-11-29 19:22:12.099578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:12.099612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.232 [2024-11-29 19:22:12.099652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:77359742976 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.232 [2024-11-29 19:22:12.099667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.490 #38 NEW cov: 
12591 ft: 15125 corp: 20/722b lim: 100 exec/s: 38 rss: 73Mb L: 49/52 MS: 1 CMP- DE: "\377\003\000\000\000\000\000\000"- 00:08:52.490 [2024-11-29 19:22:12.159883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.159910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.490 [2024-11-29 19:22:12.159948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123385963287058 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.159963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.490 [2024-11-29 19:22:12.160020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.160036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.490 #39 NEW cov: 12591 ft: 15471 corp: 21/785b lim: 100 exec/s: 39 rss: 73Mb L: 63/63 MS: 1 CopyPart- 00:08:52.490 [2024-11-29 19:22:12.219917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335392415796 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.219946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.490 [2024-11-29 19:22:12.219993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.220009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.490 #40 NEW cov: 12591 ft: 15479 corp: 22/833b lim: 100 exec/s: 40 rss: 73Mb L: 48/63 MS: 1 ChangeBinInt- 00:08:52.490 [2024-11-29 19:22:12.259880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.259911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.490 #41 NEW cov: 12591 ft: 15483 corp: 23/860b lim: 100 exec/s: 41 rss: 73Mb L: 27/63 MS: 1 ChangeByte- 00:08:52.490 [2024-11-29 19:22:12.300297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.300325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.490 [2024-11-29 19:22:12.300378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123385963287058 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.490 [2024-11-29 19:22:12.300394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.490 [2024-11-29 19:22:12.300450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
COMPARE sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.491 [2024-11-29 19:22:12.300466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.491 #42 NEW cov: 12591 ft: 15493 corp: 24/923b lim: 100 exec/s: 42 rss: 73Mb L: 63/63 MS: 1 CopyPart- 00:08:52.491 [2024-11-29 19:22:12.360290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.491 [2024-11-29 19:22:12.360318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.491 [2024-11-29 19:22:12.360351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.491 [2024-11-29 19:22:12.360368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.748 #48 NEW cov: 12591 ft: 15549 corp: 25/963b lim: 100 exec/s: 48 rss: 73Mb L: 40/63 MS: 1 InsertByte- 00:08:52.748 [2024-11-29 19:22:12.420637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.420667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.420704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123385963287058 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.420719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.420775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:11283 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.420791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.748 #54 NEW cov: 12591 ft: 15554 corp: 26/1026b lim: 100 exec/s: 54 rss: 74Mb L: 63/63 MS: 1 ChangeByte- 00:08:52.748 [2024-11-29 19:22:12.480811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.480840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.480877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.480894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.480954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1301560160997741074 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.480971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:08:52.748 #55 NEW cov: 12591 ft: 15607 corp: 27/1101b lim: 100 exec/s: 55 rss: 74Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:08:52.748 [2024-11-29 19:22:12.540980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1301560160997741842 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.541007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.541059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123385963287058 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.541075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.541131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:1302123111085380114 len:11283 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.541148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.748 #56 NEW cov: 12591 ft: 15617 corp: 28/1164b lim: 100 exec/s: 56 rss: 74Mb L: 63/75 MS: 1 ChangeBinInt- 00:08:52.748 [2024-11-29 19:22:12.600988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.601016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.748 [2024-11-29 19:22:12.601068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.748 [2024-11-29 19:22:12.601084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.748 #57 NEW cov: 12591 ft: 15621 corp: 29/1204b lim: 100 exec/s: 57 rss: 74Mb L: 40/75 MS: 1 ChangeByte- 00:08:53.007 [2024-11-29 19:22:12.661005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302215469927895570 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.661033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.007 #58 NEW cov: 12591 ft: 15680 corp: 30/1242b lim: 100 exec/s: 58 rss: 74Mb L: 38/75 MS: 1 InsertByte- 00:08:53.007 [2024-11-29 19:22:12.701277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.701305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.007 [2024-11-29 19:22:12.701340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085396498 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.701355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.007 #59 NEW cov: 12591 ft: 15717 corp: 31/1291b lim: 100 exec/s: 59 rss: 
74Mb L: 49/75 MS: 1 ChangeByte- 00:08:53.007 [2024-11-29 19:22:12.741250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123110951162386 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.741279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.007 #60 NEW cov: 12591 ft: 15862 corp: 32/1330b lim: 100 exec/s: 60 rss: 74Mb L: 39/75 MS: 1 CopyPart- 00:08:53.007 [2024-11-29 19:22:12.781506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6498705335406431028 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.781535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.007 [2024-11-29 19:22:12.781577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1302123111085380114 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.781592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.007 #61 NEW cov: 12591 ft: 15881 corp: 33/1385b lim: 100 exec/s: 61 rss: 74Mb L: 55/75 MS: 1 InsertRepeatedBytes- 00:08:53.007 [2024-11-29 19:22:12.841486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1302123111320288786 len:4627 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.007 [2024-11-29 19:22:12.841515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.007 #62 NEW cov: 12591 ft: 15885 corp: 34/1413b lim: 100 exec/s: 31 rss: 74Mb L: 28/75 MS: 1 InsertByte- 00:08:53.007 #62 DONE cov: 12591 ft: 15885 corp: 34/1413b lim: 100 exec/s: 31 rss: 74Mb 00:08:53.007 ###### Recommended dictionary. ###### 00:08:53.007 "(\326\0134Z \224\000" # Uses: 2 00:08:53.007 "\030?" # Uses: 0 00:08:53.007 "\377\003\000\000\000\000\000\000" # Uses: 0 00:08:53.007 ###### End of recommended dictionary. 
###### 00:08:53.007 Done 62 runs in 2 second(s) 00:08:53.265 19:22:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:53.265 19:22:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:53.265 19:22:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:53.265 19:22:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:53.265 00:08:53.265 real 1m2.418s 00:08:53.265 user 1m38.969s 00:08:53.265 sys 0m7.232s 00:08:53.265 19:22:12 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:53.265 19:22:12 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:53.265 ************************************ 00:08:53.265 END TEST nvmf_llvm_fuzz 00:08:53.265 ************************************ 00:08:53.265 19:22:13 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:53.265 19:22:13 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:53.265 19:22:13 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:53.265 19:22:13 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:53.265 19:22:13 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:53.265 19:22:13 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:53.265 ************************************ 00:08:53.265 START TEST vfio_llvm_fuzz 00:08:53.265 ************************************ 00:08:53.265 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:53.265 * Looking for test storage... 
00:08:53.265 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:53.265 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:53.265 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:53.265 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:53.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.527 --rc genhtml_branch_coverage=1 00:08:53.527 --rc genhtml_function_coverage=1 00:08:53.527 --rc genhtml_legend=1 00:08:53.527 --rc geninfo_all_blocks=1 00:08:53.527 --rc geninfo_unexecuted_blocks=1 00:08:53.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.527 ' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:53.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.527 --rc genhtml_branch_coverage=1 00:08:53.527 --rc genhtml_function_coverage=1 00:08:53.527 --rc genhtml_legend=1 00:08:53.527 --rc geninfo_all_blocks=1 00:08:53.527 --rc geninfo_unexecuted_blocks=1 00:08:53.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.527 ' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:53.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.527 --rc genhtml_branch_coverage=1 00:08:53.527 --rc genhtml_function_coverage=1 00:08:53.527 --rc genhtml_legend=1 00:08:53.527 --rc geninfo_all_blocks=1 00:08:53.527 --rc geninfo_unexecuted_blocks=1 00:08:53.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.527 ' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:53.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.527 --rc genhtml_branch_coverage=1 00:08:53.527 --rc genhtml_function_coverage=1 00:08:53.527 --rc genhtml_legend=1 00:08:53.527 --rc geninfo_all_blocks=1 00:08:53.527 --rc geninfo_unexecuted_blocks=1 00:08:53.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.527 ' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:53.527 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:53.528 19:22:13 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:53.528 19:22:13 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:53.528 #define SPDK_CONFIG_H 00:08:53.528 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:53.528 #define SPDK_CONFIG_APPS 1 00:08:53.528 #define SPDK_CONFIG_ARCH native 00:08:53.528 #undef SPDK_CONFIG_ASAN 00:08:53.528 #undef SPDK_CONFIG_AVAHI 00:08:53.528 #undef SPDK_CONFIG_CET 00:08:53.528 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:53.528 #define SPDK_CONFIG_COVERAGE 1 00:08:53.528 #define SPDK_CONFIG_CROSS_PREFIX 00:08:53.528 #undef SPDK_CONFIG_CRYPTO 00:08:53.528 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:53.528 #undef SPDK_CONFIG_CUSTOMOCF 00:08:53.528 #undef SPDK_CONFIG_DAOS 00:08:53.528 #define SPDK_CONFIG_DAOS_DIR 00:08:53.528 #define SPDK_CONFIG_DEBUG 1 00:08:53.528 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:53.528 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:53.528 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:53.528 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:53.528 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:53.528 #undef SPDK_CONFIG_DPDK_UADK 00:08:53.528 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:53.528 #define SPDK_CONFIG_EXAMPLES 1 00:08:53.528 #undef SPDK_CONFIG_FC 00:08:53.528 #define SPDK_CONFIG_FC_PATH 00:08:53.528 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:53.528 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:53.528 #define SPDK_CONFIG_FSDEV 1 00:08:53.528 #undef 
SPDK_CONFIG_FUSE 00:08:53.528 #define SPDK_CONFIG_FUZZER 1 00:08:53.528 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:53.528 #undef SPDK_CONFIG_GOLANG 00:08:53.528 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:53.528 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:53.528 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:53.528 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:53.528 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:53.528 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:53.528 #undef SPDK_CONFIG_HAVE_LZ4 00:08:53.528 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:53.528 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:53.528 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:53.528 #define SPDK_CONFIG_IDXD 1 00:08:53.528 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:53.528 #undef SPDK_CONFIG_IPSEC_MB 00:08:53.528 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:53.528 #define SPDK_CONFIG_ISAL 1 00:08:53.528 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:53.528 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:53.528 #define SPDK_CONFIG_LIBDIR 00:08:53.528 #undef SPDK_CONFIG_LTO 00:08:53.528 #define SPDK_CONFIG_MAX_LCORES 128 00:08:53.528 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:53.528 #define SPDK_CONFIG_NVME_CUSE 1 00:08:53.528 #undef SPDK_CONFIG_OCF 00:08:53.528 #define SPDK_CONFIG_OCF_PATH 00:08:53.528 #define SPDK_CONFIG_OPENSSL_PATH 00:08:53.528 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:53.528 #define SPDK_CONFIG_PGO_DIR 00:08:53.528 #undef SPDK_CONFIG_PGO_USE 00:08:53.528 #define SPDK_CONFIG_PREFIX /usr/local 00:08:53.528 #undef SPDK_CONFIG_RAID5F 00:08:53.528 #undef SPDK_CONFIG_RBD 00:08:53.528 #define SPDK_CONFIG_RDMA 1 00:08:53.528 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:53.528 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:53.528 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:53.528 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:53.528 #undef SPDK_CONFIG_SHARED 00:08:53.528 #undef SPDK_CONFIG_SMA 00:08:53.528 #define SPDK_CONFIG_TESTS 1 00:08:53.528 #undef SPDK_CONFIG_TSAN 00:08:53.528 #define SPDK_CONFIG_UBLK 1 00:08:53.528 #define SPDK_CONFIG_UBSAN 1 00:08:53.528 #undef SPDK_CONFIG_UNIT_TESTS 00:08:53.528 #undef SPDK_CONFIG_URING 00:08:53.528 #define SPDK_CONFIG_URING_PATH 00:08:53.528 #undef SPDK_CONFIG_URING_ZNS 00:08:53.528 #undef SPDK_CONFIG_USDT 00:08:53.528 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:53.528 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:53.528 #define SPDK_CONFIG_VFIO_USER 1 00:08:53.528 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:53.528 #define SPDK_CONFIG_VHOST 1 00:08:53.528 #define SPDK_CONFIG_VIRTIO 1 00:08:53.528 #undef SPDK_CONFIG_VTUNE 00:08:53.528 #define SPDK_CONFIG_VTUNE_DIR 00:08:53.528 #define SPDK_CONFIG_WERROR 1 00:08:53.528 #define SPDK_CONFIG_WPDK_DIR 00:08:53.528 #undef SPDK_CONFIG_XNVME 00:08:53.528 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:53.528 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:53.529 19:22:13 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:53.529 
19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:53.529 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:53.530 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 1756928 ]] 00:08:53.531 
19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 1756928 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.YUCBO8 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.YUCBO8/tests/vfio /tmp/spdk.YUCBO8 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=51104579584 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730607104 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=10626027520 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30860537856 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865301504 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340129792 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346122240 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5992448 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30863355904 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865305600 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=1949696 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:53.531 * Looking for test storage... 
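A note on the set_test_storage trace above and the candidate walk that follows below: the script parses `df -T` output into per-mount associative arrays (mounts, fss, sizes, avails, uses), resolves each storage candidate directory to its mount point, and takes the first one with at least the requested ~2 GiB free. The bash sketch below mirrors that pattern; the variable names are taken from the trace, but the `-B1` flag (added so the byte comparison is self-consistent) and the reduced selection loop are illustrative assumptions, not SPDK's exact code.

    # Sketch of the set_test_storage pattern visible in the trace above.
    declare -A mounts fss sizes avails uses
    requested_size=2214592512                  # 2 GiB plus slack, as in the trace
    storage_candidates=("$PWD" /tmp)           # real list: testdir, then /tmp fallbacks

    # df -B1 is an assumption here; the trace shows plain `df -T`.
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size
        uses["$mount"]=$use
        avails["$mount"]=$avail
    done < <(df -T -B1 | grep -v Filesystem)

    for target_dir in "${storage_candidates[@]}"; do
        # Same awk the trace uses to map a directory to its mount point.
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]:-0}
        if (( target_space >= requested_size )); then
            printf '* Found test storage at %s\n' "$target_dir"
            break
        fi
    done

The candidate order matters: the test directory itself is tried first, and the mktemp-based /tmp fallback only if the test tree's filesystem is too small; in this run the overlay root has ~51 GB available, so the in-tree directory wins, as the trace below confirms.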
00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=51104579584 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=12840620032 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:53.531 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:53.532 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:53.532 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:53.791 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:53.792 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.792 --rc genhtml_branch_coverage=1 00:08:53.792 --rc genhtml_function_coverage=1 00:08:53.792 --rc genhtml_legend=1 00:08:53.792 --rc geninfo_all_blocks=1 00:08:53.792 --rc geninfo_unexecuted_blocks=1 00:08:53.792 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.792 ' 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:53.792 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.792 --rc genhtml_branch_coverage=1 00:08:53.792 --rc genhtml_function_coverage=1 00:08:53.792 --rc genhtml_legend=1 00:08:53.792 --rc geninfo_all_blocks=1 00:08:53.792 --rc geninfo_unexecuted_blocks=1 00:08:53.792 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.792 ' 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:53.792 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.792 --rc genhtml_branch_coverage=1 00:08:53.792 --rc genhtml_function_coverage=1 00:08:53.792 --rc genhtml_legend=1 00:08:53.792 --rc geninfo_all_blocks=1 00:08:53.792 --rc geninfo_unexecuted_blocks=1 00:08:53.792 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.792 ' 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:53.792 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:53.792 --rc genhtml_branch_coverage=1 00:08:53.792 --rc genhtml_function_coverage=1 00:08:53.792 --rc genhtml_legend=1 00:08:53.792 --rc geninfo_all_blocks=1 00:08:53.792 --rc geninfo_unexecuted_blocks=1 00:08:53.792 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:53.792 ' 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:53.792 19:22:13 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:53.792 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:53.792 19:22:13 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0
00:08:53.792 [2024-11-29 19:22:13.523686] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization...
00:08:53.792 [2024-11-29 19:22:13.523756] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1756986 ]
00:08:53.792 [2024-11-29 19:22:13.600886] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:53.792 [2024-11-29 19:22:13.622771] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:54.051 INFO: Running with entropic power schedule (0xFF, 100).
00:08:54.051 INFO: Seed: 4275642458
00:08:54.051 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85),
00:08:54.051 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18),
00:08:54.052 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:54.052 INFO: A corpus is not provided, starting from an empty corpus
00:08:54.052 #2 INITED exec/s: 0 rss: 66Mb
00:08:54.052 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:54.052 This may also happen if the target rejected all inputs we tried so far
00:08:54.052 [2024-11-29 19:22:13.852983] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller
00:08:54.568 NEW_FUNC[1/676]: 0x459068 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84
00:08:54.568 NEW_FUNC[2/676]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:54.568 #12 NEW cov: 11242 ft: 11207 corp: 2/7b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 5 ShuffleBytes-InsertByte-CopyPart-ChangeBit-InsertRepeatedBytes-
00:08:54.827 #13 NEW cov: 11256 ft: 14288 corp: 3/13b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeByte-
00:08:54.827 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:08:54.827 #14 NEW cov: 11273 ft: 14818 corp: 4/19b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ChangeByte-
00:08:55.085 #15 NEW cov: 11273 ft: 15628 corp: 5/25b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ChangeByte-
00:08:55.345 #16 NEW cov: 11273 ft: 16229 corp: 6/31b lim: 6 exec/s: 16 rss: 75Mb L: 6/6 MS: 1 CrossOver-
00:08:55.345 #17 NEW cov: 11273 ft: 16345 corp: 7/37b lim: 6 exec/s: 17 rss: 75Mb L: 6/6 MS: 1 ChangeByte-
00:08:55.602 #18 NEW cov: 11273 ft: 16562 corp: 8/43b lim: 6 exec/s: 18 rss: 75Mb L: 6/6 MS: 1 CopyPart-
00:08:55.602 #19 NEW cov: 11273 ft: 17628 corp: 9/49b lim: 6 exec/s: 19 rss: 75Mb L: 6/6 MS: 1 CrossOver-
00:08:55.860 #20 NEW cov: 11280 ft: 17729 corp: 10/55b lim: 6 exec/s: 20 rss: 75Mb L: 6/6 MS: 1 CrossOver-
00:08:56.119 #21 NEW cov: 11280 ft: 17985 corp: 11/61b lim: 6 exec/s: 10 rss: 75Mb L: 6/6 MS: 1 ShuffleBytes-
00:08:56.119 #21 DONE cov: 11280 ft: 17985 corp: 11/61b lim: 6 exec/s: 10 rss: 75Mb
00:08:56.119 Done 21 runs in 2 second(s)
00:08:56.119 [2024-11-29 19:22:15.852808] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%;
00:08:56.378 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:08:56.378 19:22:16 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1
00:08:56.378 [2024-11-29 19:22:16.113368] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization...
00:08:56.378 [2024-11-29 19:22:16.113456] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757519 ]
00:08:56.378 [2024-11-29 19:22:16.192333] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:56.378 [2024-11-29 19:22:16.214453] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:56.637 INFO: Running with entropic power schedule (0xFF, 100).
00:08:56.637 INFO: Seed: 2574674508 00:08:56.637 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85), 00:08:56.637 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18), 00:08:56.637 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:56.637 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.637 #2 INITED exec/s: 0 rss: 66Mb 00:08:56.637 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:56.637 This may also happen if the target rejected all inputs we tried so far 00:08:56.637 [2024-11-29 19:22:16.448309] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:56.637 [2024-11-29 19:22:16.491659] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:56.637 [2024-11-29 19:22:16.491699] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:56.637 [2024-11-29 19:22:16.491738] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.154 NEW_FUNC[1/678]: 0x459608 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:57.154 NEW_FUNC[2/678]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:57.154 #17 NEW cov: 11231 ft: 11187 corp: 2/5b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 5 InsertByte-CopyPart-ChangeByte-ChangeByte-CrossOver- 00:08:57.154 [2024-11-29 19:22:16.962219] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.154 [2024-11-29 19:22:16.962252] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.154 [2024-11-29 19:22:16.962296] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.413 #22 NEW cov: 11248 ft: 14619 corp: 3/9b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 5 ChangeByte-CrossOver-InsertByte-InsertByte-InsertByte- 00:08:57.413 [2024-11-29 19:22:17.160650] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.413 [2024-11-29 19:22:17.160675] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.413 [2024-11-29 19:22:17.160694] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.413 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:57.413 #23 NEW cov: 11265 ft: 15174 corp: 4/13b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:57.672 [2024-11-29 19:22:17.346426] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.672 [2024-11-29 19:22:17.346451] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.672 [2024-11-29 19:22:17.346479] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.672 #24 NEW cov: 11265 ft: 16334 corp: 5/17b lim: 4 exec/s: 24 rss: 75Mb L: 4/4 MS: 1 ChangeBit- 00:08:57.672 [2024-11-29 19:22:17.527293] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.672 [2024-11-29 19:22:17.527316] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.672 [2024-11-29 19:22:17.527342] 
vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.931 #25 NEW cov: 11265 ft: 16565 corp: 6/21b lim: 4 exec/s: 25 rss: 75Mb L: 4/4 MS: 1 ChangeBit- 00:08:57.931 [2024-11-29 19:22:17.715458] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.931 [2024-11-29 19:22:17.715480] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.931 [2024-11-29 19:22:17.715498] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.931 #26 NEW cov: 11265 ft: 17014 corp: 7/25b lim: 4 exec/s: 26 rss: 75Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:58.190 [2024-11-29 19:22:17.896923] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.190 [2024-11-29 19:22:17.896949] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.190 [2024-11-29 19:22:17.896966] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.190 #27 NEW cov: 11265 ft: 17055 corp: 8/29b lim: 4 exec/s: 27 rss: 75Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:58.190 [2024-11-29 19:22:18.080334] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.190 [2024-11-29 19:22:18.080356] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.190 [2024-11-29 19:22:18.080373] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.449 #28 NEW cov: 11265 ft: 17098 corp: 9/33b lim: 4 exec/s: 28 rss: 75Mb L: 4/4 MS: 1 CrossOver- 00:08:58.449 [2024-11-29 19:22:18.262806] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.449 [2024-11-29 19:22:18.262830] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.449 [2024-11-29 19:22:18.262847] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.707 #29 NEW cov: 11272 ft: 17205 corp: 10/37b lim: 4 exec/s: 29 rss: 75Mb L: 4/4 MS: 1 CopyPart- 00:08:58.707 [2024-11-29 19:22:18.443549] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.707 [2024-11-29 19:22:18.443572] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.707 [2024-11-29 19:22:18.443590] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.707 #30 NEW cov: 11272 ft: 17627 corp: 11/41b lim: 4 exec/s: 15 rss: 75Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:58.707 #30 DONE cov: 11272 ft: 17627 corp: 11/41b lim: 4 exec/s: 15 rss: 75Mb 00:08:58.707 Done 30 runs in 2 second(s) 00:08:58.707 [2024-11-29 19:22:18.571789] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:58.966 19:22:18 
llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:58.966 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:58.966 19:22:18 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:58.966 [2024-11-29 19:22:18.833095] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:08:58.966 [2024-11-29 19:22:18.833167] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757876 ] 00:08:59.224 [2024-11-29 19:22:18.913741] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.224 [2024-11-29 19:22:18.936188] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:59.224 INFO: Running with entropic power schedule (0xFF, 100). 00:08:59.224 INFO: Seed: 1001728622 00:08:59.483 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85), 00:08:59.483 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18), 00:08:59.483 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:59.483 INFO: A corpus is not provided, starting from an empty corpus 00:08:59.483 #2 INITED exec/s: 0 rss: 66Mb 00:08:59.483 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:59.483 This may also happen if the target rejected all inputs we tried so far 00:08:59.483 [2024-11-29 19:22:19.177949] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:59.483 [2024-11-29 19:22:19.254261] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.048 NEW_FUNC[1/677]: 0x459ff8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:00.048 NEW_FUNC[2/677]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:00.048 #6 NEW cov: 11221 ft: 11182 corp: 2/9b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 4 InsertRepeatedBytes-ChangeBit-ChangeByte-CopyPart- 00:09:00.048 [2024-11-29 19:22:19.740426] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.048 #12 NEW cov: 11235 ft: 14350 corp: 3/17b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:09:00.048 [2024-11-29 19:22:19.930700] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.306 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:00.306 #13 NEW cov: 11252 ft: 15584 corp: 4/25b lim: 8 exec/s: 0 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:00.306 [2024-11-29 19:22:20.129189] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.564 #19 NEW cov: 11252 ft: 16902 corp: 5/33b lim: 8 exec/s: 19 rss: 75Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:00.564 [2024-11-29 19:22:20.308003] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.564 #20 NEW cov: 11252 ft: 17264 corp: 6/41b lim: 8 exec/s: 20 rss: 75Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:00.823 [2024-11-29 19:22:20.491413] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.823 #46 NEW cov: 11252 ft: 17508 corp: 7/49b lim: 8 exec/s: 46 rss: 76Mb L: 8/8 MS: 1 ChangeBit- 00:09:00.823 [2024-11-29 19:22:20.671817] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:01.082 #47 NEW cov: 11252 ft: 17774 corp: 8/57b lim: 8 exec/s: 47 rss: 76Mb L: 8/8 MS: 1 CrossOver- 00:09:01.082 [2024-11-29 19:22:20.852928] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:01.082 #48 NEW cov: 11259 ft: 18019 corp: 9/65b lim: 8 exec/s: 48 rss: 76Mb L: 8/8 MS: 1 CopyPart- 00:09:01.340 [2024-11-29 19:22:21.031955] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:01.340 #49 NEW cov: 11259 ft: 18113 corp: 10/73b lim: 8 exec/s: 49 rss: 76Mb L: 8/8 MS: 1 ChangeByte- 00:09:01.340 [2024-11-29 19:22:21.210750] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:01.598 #50 NEW cov: 11259 ft: 18181 corp: 11/81b lim: 8 exec/s: 25 rss: 76Mb L: 8/8 MS: 1 CopyPart- 00:09:01.598 #50 DONE cov: 11259 ft: 18181 corp: 11/81b lim: 8 exec/s: 25 rss: 76Mb 00:09:01.598 Done 50 runs in 2 second(s) 00:09:01.598 [2024-11-29 19:22:21.331793] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- 
../common.sh@72 -- # (( i++ )) 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:01.857 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:01.857 19:22:21 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:01.857 [2024-11-29 19:22:21.591871] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:09:01.857 [2024-11-29 19:22:21.591940] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1758351 ] 00:09:01.857 [2024-11-29 19:22:21.670181] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.857 [2024-11-29 19:22:21.692289] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.117 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:02.117 INFO: Seed: 3762716433 00:09:02.117 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85), 00:09:02.117 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18), 00:09:02.117 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:02.117 INFO: A corpus is not provided, starting from an empty corpus 00:09:02.117 #2 INITED exec/s: 0 rss: 66Mb 00:09:02.117 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:02.117 This may also happen if the target rejected all inputs we tried so far 00:09:02.117 [2024-11-29 19:22:21.938594] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:02.635 NEW_FUNC[1/676]: 0x45a6e8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:02.635 NEW_FUNC[2/676]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:02.635 #295 NEW cov: 11211 ft: 11198 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 3 ChangeBit-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:02.893 NEW_FUNC[1/1]: 0xfb94f8 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:445 00:09:02.894 #304 NEW cov: 11243 ft: 14714 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 4 InsertRepeatedBytes-InsertRepeatedBytes-ChangeBinInt-InsertByte- 00:09:02.894 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:02.894 #305 NEW cov: 11260 ft: 15208 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:03.152 #306 NEW cov: 11260 ft: 16292 corp: 5/129b lim: 32 exec/s: 306 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:03.411 #307 NEW cov: 11260 ft: 16519 corp: 6/161b lim: 32 exec/s: 307 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:03.411 #308 NEW cov: 11260 ft: 16787 corp: 7/193b lim: 32 exec/s: 308 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:03.670 #309 NEW cov: 11260 ft: 17334 corp: 8/225b lim: 32 exec/s: 309 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:03.928 #310 NEW cov: 11260 ft: 17447 corp: 9/257b lim: 32 exec/s: 310 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:03.928 #311 NEW cov: 11267 ft: 17533 corp: 10/289b lim: 32 exec/s: 311 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:09:04.187 #312 NEW cov: 11267 ft: 17755 corp: 11/321b lim: 32 exec/s: 156 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:04.187 #312 DONE cov: 11267 ft: 17755 corp: 11/321b lim: 32 exec/s: 156 rss: 75Mb 00:09:04.187 Done 312 runs in 2 second(s) 00:09:04.187 [2024-11-29 19:22:23.982795] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:04.447 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:04.447 19:22:24 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:04.447 [2024-11-29 19:22:24.240685] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:09:04.447 [2024-11-29 19:22:24.240757] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1758886 ] 00:09:04.447 [2024-11-29 19:22:24.318022] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.447 [2024-11-29 19:22:24.340188] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.731 INFO: Running with entropic power schedule (0xFF, 100). 00:09:04.731 INFO: Seed: 2115762179 00:09:04.731 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85), 00:09:04.731 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18), 00:09:04.731 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:04.731 INFO: A corpus is not provided, starting from an empty corpus 00:09:04.731 #2 INITED exec/s: 0 rss: 66Mb 00:09:04.731 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:04.731 This may also happen if the target rejected all inputs we tried so far 00:09:04.731 [2024-11-29 19:22:24.579156] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:05.249 NEW_FUNC[1/677]: 0x45af68 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:05.249 NEW_FUNC[2/677]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:05.249 #44 NEW cov: 11224 ft: 11188 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 2 CopyPart-InsertRepeatedBytes- 00:09:05.507 #50 NEW cov: 11238 ft: 14953 corp: 3/65b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:05.507 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:05.507 #61 NEW cov: 11258 ft: 15772 corp: 4/97b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:05.765 #62 NEW cov: 11258 ft: 16623 corp: 5/129b lim: 32 exec/s: 62 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:06.025 #68 NEW cov: 11258 ft: 17407 corp: 6/161b lim: 32 exec/s: 68 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:06.283 #69 NEW cov: 11258 ft: 17506 corp: 7/193b lim: 32 exec/s: 69 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:06.283 #70 NEW cov: 11258 ft: 17668 corp: 8/225b lim: 32 exec/s: 70 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:06.542 #71 NEW cov: 11258 ft: 18124 corp: 9/257b lim: 32 exec/s: 71 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:06.802 #72 NEW cov: 11265 ft: 18184 corp: 10/289b lim: 32 exec/s: 72 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:09:06.802 #73 NEW cov: 11265 ft: 18507 corp: 11/321b lim: 32 exec/s: 36 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:06.802 #73 DONE cov: 11265 ft: 18507 corp: 11/321b lim: 32 exec/s: 36 rss: 75Mb 00:09:06.802 Done 73 runs in 2 second(s) 00:09:06.802 [2024-11-29 19:22:26.657793] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:07.061 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:07.061 19:22:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:07.061 [2024-11-29 19:22:26.918461] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:09:07.062 [2024-11-29 19:22:26.918556] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1759392 ] 00:09:07.321 [2024-11-29 19:22:26.999181] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.321 [2024-11-29 19:22:27.021740] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.321 INFO: Running with entropic power schedule (0xFF, 100). 00:09:07.321 INFO: Seed: 498779133 00:09:07.322 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85), 00:09:07.322 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18), 00:09:07.322 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:07.322 INFO: A corpus is not provided, starting from an empty corpus 00:09:07.322 #2 INITED exec/s: 0 rss: 66Mb 00:09:07.322 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:07.322 This may also happen if the target rejected all inputs we tried so far 00:09:07.581 [2024-11-29 19:22:27.257732] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:07.581 [2024-11-29 19:22:27.302618] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:07.581 [2024-11-29 19:22:27.302651] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:07.841 NEW_FUNC[1/678]: 0x45b968 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:07.841 NEW_FUNC[2/678]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:07.841 #72 NEW cov: 11237 ft: 11197 corp: 2/14b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 5 ChangeByte-ChangeByte-InsertRepeatedBytes-CopyPart-InsertByte- 00:09:08.101 [2024-11-29 19:22:27.755698] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.101 [2024-11-29 19:22:27.755740] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.101 #83 NEW cov: 11254 ft: 14021 corp: 3/27b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 CopyPart- 00:09:08.101 [2024-11-29 19:22:27.931455] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.101 [2024-11-29 19:22:27.931490] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.360 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:08.360 #84 NEW cov: 11271 ft: 14775 corp: 4/40b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:09:08.360 [2024-11-29 19:22:28.107291] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.360 [2024-11-29 19:22:28.107324] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.360 #85 NEW cov: 11271 ft: 14947 corp: 5/53b lim: 13 exec/s: 0 rss: 76Mb L: 13/13 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:09:08.619 [2024-11-29 19:22:28.284988] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.619 [2024-11-29 19:22:28.285019] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.619 #86 NEW cov: 11271 ft: 15238 corp: 6/66b lim: 13 exec/s: 86 rss: 76Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:08.619 [2024-11-29 19:22:28.457202] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.619 [2024-11-29 19:22:28.457232] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.879 #87 NEW cov: 11271 ft: 16836 corp: 7/79b lim: 13 exec/s: 87 rss: 76Mb L: 13/13 MS: 1 ChangeByte- 00:09:08.879 [2024-11-29 19:22:28.630717] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.879 [2024-11-29 19:22:28.630749] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.879 #93 NEW cov: 11271 ft: 16888 corp: 8/92b lim: 13 exec/s: 93 rss: 76Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:09.138 [2024-11-29 19:22:28.805761] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.138 [2024-11-29 19:22:28.805792] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 
return failure 00:09:09.138 #94 NEW cov: 11271 ft: 17336 corp: 9/105b lim: 13 exec/s: 94 rss: 76Mb L: 13/13 MS: 1 CMP- DE: "\001\000\000\000\004\345\355\324"- 00:09:09.138 [2024-11-29 19:22:28.987549] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.138 [2024-11-29 19:22:28.987580] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:09.397 #95 NEW cov: 11278 ft: 17371 corp: 10/118b lim: 13 exec/s: 95 rss: 76Mb L: 13/13 MS: 1 ChangeByte- 00:09:09.397 [2024-11-29 19:22:29.159053] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.397 [2024-11-29 19:22:29.159083] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:09.397 #96 NEW cov: 11278 ft: 17798 corp: 11/131b lim: 13 exec/s: 48 rss: 76Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:09.397 #96 DONE cov: 11278 ft: 17798 corp: 11/131b lim: 13 exec/s: 48 rss: 76Mb 00:09:09.397 ###### Recommended dictionary. ###### 00:09:09.397 "\001\000\000\000\000\000\000\000" # Uses: 1 00:09:09.397 "\001\000\000\000\004\345\355\324" # Uses: 0 00:09:09.397 ###### End of recommended dictionary. ###### 00:09:09.397 Done 96 runs in 2 second(s) 00:09:09.397 [2024-11-29 19:22:29.277791] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:09.656 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:09.657 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:09.657 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo 
leak:spdk_nvmf_qpair_disconnect 00:09:09.657 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:09.657 19:22:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:09.657 [2024-11-29 19:22:29.535717] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 23.11.0 initialization... 00:09:09.657 [2024-11-29 19:22:29.535784] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1759711 ] 00:09:09.915 [2024-11-29 19:22:29.616579] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.915 [2024-11-29 19:22:29.639022] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.174 INFO: Running with entropic power schedule (0xFF, 100). 00:09:10.174 INFO: Seed: 3121812093 00:09:10.174 INFO: Loaded 1 modules (387001 inline 8-bit counters): 387001 [0x2ac06cc, 0x2b1ee85), 00:09:10.174 INFO: Loaded 1 PC tables (387001 PCs): 387001 [0x2b1ee88,0x3106a18), 00:09:10.174 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:10.174 INFO: A corpus is not provided, starting from an empty corpus 00:09:10.174 #2 INITED exec/s: 0 rss: 66Mb 00:09:10.174 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:10.174 This may also happen if the target rejected all inputs we tried so far 00:09:10.174 [2024-11-29 19:22:29.880593] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:10.174 [2024-11-29 19:22:29.951299] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:10.174 [2024-11-29 19:22:29.951335] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:10.433 NEW_FUNC[1/678]: 0x45c658 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:10.433 NEW_FUNC[2/678]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:10.433 #25 NEW cov: 11232 ft: 10720 corp: 2/10b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 3 ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:10.693 [2024-11-29 19:22:30.421918] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:10.693 [2024-11-29 19:22:30.421965] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:10.693 #26 NEW cov: 11246 ft: 14478 corp: 3/19b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:10.953 [2024-11-29 19:22:30.620028] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:10.953 [2024-11-29 19:22:30.620061] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:10.953 NEW_FUNC[1/1]: 0x1c31f18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:10.953 #27 NEW cov: 11263 ft: 15618 corp: 4/28b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 CrossOver- 00:09:10.953 [2024-11-29 19:22:30.812894] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:10.953 [2024-11-29 19:22:30.812925] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.212 #28 NEW cov: 11263 ft: 15795 corp: 5/37b lim: 9 exec/s: 28 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "f\000\000\000\000\000\000\000"- 00:09:11.212 [2024-11-29 19:22:30.985380] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.212 [2024-11-29 19:22:30.985411] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.212 #39 NEW cov: 11263 ft: 16299 corp: 6/46b lim: 9 exec/s: 39 rss: 75Mb L: 9/9 MS: 1 ChangeByte- 00:09:11.472 [2024-11-29 19:22:31.166580] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.472 [2024-11-29 19:22:31.166619] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.472 #45 NEW cov: 11263 ft: 17206 corp: 7/55b lim: 9 exec/s: 45 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:09:11.472 [2024-11-29 19:22:31.345637] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.472 [2024-11-29 19:22:31.345667] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.731 #46 NEW cov: 11263 ft: 17239 corp: 8/64b lim: 9 exec/s: 46 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:09:11.731 [2024-11-29 19:22:31.527584] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.731 [2024-11-29 19:22:31.527622] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.731 #47 NEW 
cov: 11263 ft: 17497 corp: 9/73b lim: 9 exec/s: 47 rss: 76Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:11.990 [2024-11-29 19:22:31.700000] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.990 [2024-11-29 19:22:31.700030] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.990 #48 NEW cov: 11270 ft: 17573 corp: 10/82b lim: 9 exec/s: 48 rss: 76Mb L: 9/9 MS: 1 CrossOver- 00:09:11.990 [2024-11-29 19:22:31.872574] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.990 [2024-11-29 19:22:31.872610] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.250 #49 NEW cov: 11270 ft: 17816 corp: 11/91b lim: 9 exec/s: 24 rss: 76Mb L: 9/9 MS: 1 ChangeByte- 00:09:12.250 #49 DONE cov: 11270 ft: 17816 corp: 11/91b lim: 9 exec/s: 24 rss: 76Mb 00:09:12.250 ###### Recommended dictionary. ###### 00:09:12.250 "f\000\000\000\000\000\000\000" # Uses: 0 00:09:12.250 ###### End of recommended dictionary. ###### 00:09:12.250 Done 49 runs in 2 second(s) 00:09:12.250 [2024-11-29 19:22:31.990793] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:12.509 19:22:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:12.509 19:22:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:12.509 19:22:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.509 19:22:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:12.509 00:09:12.509 real 0m19.142s 00:09:12.509 user 0m27.168s 00:09:12.509 sys 0m1.830s 00:09:12.509 19:22:32 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:12.509 19:22:32 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:12.509 ************************************ 00:09:12.509 END TEST vfio_llvm_fuzz 00:09:12.509 ************************************ 00:09:12.509 00:09:12.509 real 1m21.920s 00:09:12.509 user 2m6.309s 00:09:12.509 sys 0m9.280s 00:09:12.509 19:22:32 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:12.509 19:22:32 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:12.509 ************************************ 00:09:12.509 END TEST llvm_fuzz 00:09:12.509 ************************************ 00:09:12.509 19:22:32 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]] 00:09:12.509 19:22:32 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT 00:09:12.509 19:22:32 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup 00:09:12.509 19:22:32 -- common/autotest_common.sh@726 -- # xtrace_disable 00:09:12.509 19:22:32 -- common/autotest_common.sh@10 -- # set +x 00:09:12.509 19:22:32 -- spdk/autotest.sh@388 -- # autotest_cleanup 00:09:12.509 19:22:32 -- common/autotest_common.sh@1396 -- # local autotest_es=0 00:09:12.509 19:22:32 -- common/autotest_common.sh@1397 -- # xtrace_disable 00:09:12.509 19:22:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.079 INFO: APP EXITING 00:09:19.079 INFO: killing all VMs 00:09:19.079 INFO: killing vhost app 00:09:19.079 INFO: EXIT DONE 00:09:22.372 Waiting for block devices as requested 00:09:22.372 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:22.372 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:22.372 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:22.372 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:22.632 0000:00:04.3 (8086 2021): 
vfio-pci -> ioatdma
00:09:22.632 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:22.632 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:22.632 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:22.893 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:22.893 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:22.893 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:23.152 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:23.152 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:23.152 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:23.412 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:23.412 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:23.412 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:26.912 Cleaning
00:09:26.912 Removing: /dev/shm/spdk_tgt_trace.pid1732529
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1730034
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1731135
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1732529
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1732985
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1734018
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1734090
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1735199
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1735205
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1735639
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1735967
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1736294
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1736501
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1736709
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1736995
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1737275
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1737593
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1738189
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1741351
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1741537
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1741699
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1741847
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1742254
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1742260
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1742825
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1742831
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1743215
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1743362
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1743552
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1743627
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1744582
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1744907
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1745131
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1745271
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1746018
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1746315
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1746842
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1747230
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1747663
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1748184
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1748485
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1749022
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1749374
00:09:26.912 Removing: /var/run/dpdk/spdk_pid1749842
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1750319
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1750659
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1751193
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1751533
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1752012
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1752537
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1752833
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1753362
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1753788
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1754186
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1754722
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1755010
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1755538
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1755956
00:09:27.171 Removing: /var/run/dpdk/spdk_pid1756364
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1756986
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1757519
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1757876
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1758351
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1758886
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1759392
00:09:27.172 Removing: /var/run/dpdk/spdk_pid1759711
00:09:27.172 Clean
00:09:27.172 19:22:47 -- common/autotest_common.sh@1453 -- # return 0
00:09:27.172 19:22:47 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup
00:09:27.172 19:22:47 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:27.172 19:22:47 -- common/autotest_common.sh@10 -- # set +x
00:09:27.172 19:22:47 -- spdk/autotest.sh@391 -- # timing_exit autotest
00:09:27.172 19:22:47 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:27.172 19:22:47 -- common/autotest_common.sh@10 -- # set +x
00:09:27.431 19:22:47 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:27.431 19:22:47 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:27.431 19:22:47 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:27.431 19:22:47 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:09:27.431 19:22:47 -- spdk/autotest.sh@398 -- # hostname
00:09:27.431 19:22:47 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:27.431 geninfo: WARNING: invalid characters removed from testname!
00:09:34.027 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda
00:09:34.027 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda
00:09:40.598 19:22:59 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:47.168 19:23:06 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:52.444 19:23:12 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:57.718 19:23:17 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:02.992 19:23:22 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:08.266 19:23:27 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
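
Steps @399 through @407 are the standard lcov merge-and-filter flow: -a/--add-tracefile combines the baseline and test captures into cov_total.info, then repeated -r/--remove passes strip coverage records whose source paths should not count against SPDK itself (bundled DPDK, system headers under /usr, example and utility apps). Writing the filtered result back over the same -o file is how the job does it above, which works because lcov reads the whole input tracefile before it writes the output. The same flow in miniature, with placeholder file names:

    # Merge a baseline capture and a test capture, then prune paths.
    lcov -q -a cov_base.info -a cov_test.info -o cov_total.info
    lcov -q -r cov_total.info '*/dpdk/*' -o cov_total.info
    lcov -q -r cov_total.info '/usr/*' -o cov_total.info
    # Each -r pass removes records whose file names match the glob;
    # quoting the glob keeps the shell from expanding it first.
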
00:10:13.539 19:23:33 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:10:13.539 19:23:33 -- spdk/autorun.sh@1 -- $ timing_finish
00:10:13.539 19:23:33 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]]
00:10:13.539 19:23:33 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:13.539 19:23:33 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:10:13.539 19:23:33 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:13.539 + [[ -n 1605234 ]]
00:10:13.539 + sudo kill 1605234
00:10:13.549 [Pipeline] }
00:10:13.566 [Pipeline] // stage
00:10:13.571 [Pipeline] }
00:10:13.586 [Pipeline] // timeout
00:10:13.592 [Pipeline] }
00:10:13.608 [Pipeline] // catchError
00:10:13.614 [Pipeline] }
00:10:13.630 [Pipeline] // wrap
00:10:13.636 [Pipeline] }
00:10:13.648 [Pipeline] // catchError
00:10:13.657 [Pipeline] stage
00:10:13.659 [Pipeline] { (Epilogue)
00:10:13.670 [Pipeline] catchError
00:10:13.671 [Pipeline] {
00:10:13.684 [Pipeline] echo
00:10:13.686 Cleanup processes
00:10:13.694 [Pipeline] sh
00:10:13.984 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:13.984 1768357 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:13.999 [Pipeline] sh
00:10:14.284 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:14.284 ++ grep -v 'sudo pgrep'
00:10:14.284 ++ awk '{print $1}'
00:10:14.284 + sudo kill -9
00:10:14.284 + true
00:10:14.296 [Pipeline] sh
00:10:14.580 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:14.580 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:14.581 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:15.957 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:28.175 [Pipeline] sh
00:10:28.462 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:28.462 Artifacts sizes are good
00:10:28.479 [Pipeline] archiveArtifacts
00:10:28.487 Archiving artifacts
00:10:28.699 [Pipeline] sh
00:10:29.060 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:29.078 [Pipeline] cleanWs
00:10:29.090 [WS-CLEANUP] Deleting project workspace...
00:10:29.091 [WS-CLEANUP] Deferred wipeout is used...
00:10:29.097 [WS-CLEANUP] done
00:10:29.099 [Pipeline] }
00:10:29.119 [Pipeline] // catchError
00:10:29.135 [Pipeline] sh
00:10:29.423 + logger -p user.info -t JENKINS-CI
00:10:29.432 [Pipeline] }
00:10:29.448 [Pipeline] // stage
00:10:29.453 [Pipeline] }
00:10:29.469 [Pipeline] // node
00:10:29.475 [Pipeline] End of Pipeline
00:10:29.518 Finished: SUCCESS
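
A closing note on the process cleanup that brackets this job (the pgrep/kill sequence in both the Prologue and the Epilogue above): pgrep -af matches against full command lines, so the pipeline has to drop its own pgrep invocation from the results before killing what remains. Condensed into one line, with a placeholder workspace path:

    # Kill any leftover processes still running out of the workspace.
    sudo kill -9 $(sudo pgrep -af /path/to/workspace/spdk | grep -v 'sudo pgrep' | awk '{print $1}') || true
    # grep -v removes the matching pgrep itself; awk keeps only the PIDs;
    # "|| true" keeps the stage green when nothing is left to kill, which
    # is why the log shows "+ sudo kill -9" followed by "+ true".
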